Zuckerberg Denies Fake News On Facebook Had Impact On The Election

CEO Mark Zuckerberg defended Facebook against criticism that it doesn't vet fake news in its News Feed. (Lluis Gene / AFP/Getty Images)

Mark Zuckerberg says the notion that fake news influenced the U.S. presidential election is "a pretty crazy idea."

The Facebook CEO is finding himself in a unique position in this election cycle. Many news organizations have come under fire for their coverage of the campaign. Now Facebook is getting it too, as a modern media company that does not vet fake news in its News Feed and that, critics argue, allows users to stay in information bubbles that reinforce their existing prejudices.

Zuckerberg took both these criticisms head-on yesterday, at a conference called Techonomy. (You can find the full interview on his Facebook feed.)

He says hoaxes existed before his platform was created. They aren't new, and people who say misinformation is why Donald Trump won simply do not get it. "There's a profound lack of empathy in asserting that the only reason why someone could have voted the way that they did is because they saw some fake news," Zuckerberg says.

He also says his company has studied fake news and found it makes up a "very small volume" of the content on Facebook. He did not specify whether that content is more or less viral, or more impactful, than other information.

Weeks earlier, Facebook's algorithms accidentally promoted to "trending news" a fake story claiming that Fox News anchor Megyn Kelly had pledged her support for Democratic candidate Hillary Clinton.

Zuckerberg also says his team has studied the filter bubble effect and that the research shows almost everyone has some friend on the other side of the aisle. A Democrat may think there's no Republican in his News Feed, but there likely is. The social network's ability to connect people makes it "inherently more diverse" than the major news stations of 20 years ago, Zuckerberg says.

He says right now the problem is not that diverse information isn't there. It's there more than it was in the days of traditional media. The problem, he says, is that people don't click on things that don't conform to their worldview. And, he says, "I don't know what to do about that."

Facebook is not a free speech platform. It has a long list of rules, called Community Standards, of things you're not allowed to say or share. Naked pictures of children are a well-known example.

During this election cycle, Zuckerberg personally intervened to change the rules. When Trump called for a ban on all Muslims, it was clearly hate speech as defined by Facebook's guidelines. But the CEO ordered his staff not to take it down because it was newsworthy.

That example is just one illustration of the extraordinary power Zuckerberg has amassed to decide what counts as news. And from his talk, it's clear that he's thinking about how to wield this power.

"When we started, the north star for us was: We're building a safe community," Zuckerberg says. He thought about how to control for bullies. One of the things that has shifted, he says, is that now news is a more important part of Facebook content. "We're still working through what that means."

Copyright 2020 NPR. To see more, visit https://www.npr.org.

Aarti Shahani is a correspondent for NPR. Based in Silicon Valley, she covers the biggest companies on earth. She is also an author. Her first book, Here We Are: American Dreams, American Nightmares (out Oct. 1, 2019), is about the extreme ups and downs her family encountered as immigrants in the U.S. Before journalism, Shahani was a community organizer in her native New York City, helping prisoners and families facing deportation. Even if it looks like she keeps changing careers, she's always doing the same thing: telling stories that matter.