
Facebook may have built then ditched an update to fight fake news

Still under fire after the stunning results of last week's election, the social network could have released tools to combat bogus news stories, according to one report. What stopped it? A potential conservative backlash.

Richard Nieva, former senior reporter

Facebook may have shelved a News Feed update that would have cracked down on fake news.

James Martin/CNET

Did Facebook help get Donald Trump elected?

CEO Mark Zuckerberg said last week that it's a "pretty crazy idea" to think his social network might have swayed the vote by letting fake news proliferate on its site. That came after numerous allegations that fake news shared on the social network helped Trump win.

Now there's a report that Facebook could have helped combat that fake news with an update to its News Feed. The thing is, Zuckerberg and his team never released it, according to a report published Monday by Gizmodo, which cited two unnamed sources it says have firsthand knowledge of Facebook's plans.

The software update would have been used for "downgrading and removing" fake news stories from people's Facebook feeds, according to Gizmodo. The update was apparently shelved because it would have disproportionately blocked stories from right-wing news sites, and Facebook didn't want to give the impression it was politically biased.

So by trying not to be politically biased, did Facebook actually end up favoring the Republican nominee?

"We did not build and withhold any News Feed changes based on their potential impact on any one political party," a Facebook spokesman said Monday, adding that the Gizmodo story is "not true."

So why would Facebook even be worried that it looked biased? That reportedly stems from a controversy earlier this year, in which Facebook was accused of encouraging its human staffers to hide conservative news from its "trending stories" feature. After that, Facebook did a review of all its products to make sure there wasn't any appearance of political bias, according to Gizmodo.

"They absolutely have the tools to shut down fake news," one source told Gizmodo.

That bogus news includes a story claiming that an FBI agent associated with the Hillary Clinton email leaks was found dead in a murder-suicide (didn't happen). Another fake story said the Pope endorsed Trump (again, nope).

The stakes are so high because Facebook, with its 1.79 billion users, is a big source for news in the US. Over 40 percent of American adults get their news from Facebook, according to the Pew Research Center and Knight Foundation. So Facebook, with its mighty algorithms that determine what you see or don't see in your News Feed, faces tough questions about its responsibility in informing the public -- or more specifically in this case, the voting electorate.

It's not only a Facebook problem. Google on Monday responded to reports that a fake story about Trump winning the popular vote was showing up at the top of its search results. "In this case we clearly didn't get it right, but we are continually working to improve our algorithms," a Google spokeswoman said.

For its part, Facebook says the company "continuously review[s] updates to make sure we are not exhibiting unconscious bias."

The denial comes after Zuckerberg last week defended Facebook and its role in the election. He said it took a "profound lack of empathy" to think someone would choose how to vote based on fake news. Over the weekend, he posted a lengthy note on his Facebook page, reiterating that fake news is just a small part of Facebook content. He did say, though, that Facebook still has work to do.

"We have made progress, and we will continue to work on this to improve further," he said.

He also said Facebook needs to "proceed very carefully" in taking on a role where it is expected to identify "truth." (Quotes around truth are his.) "I believe we must be extremely cautious about becoming arbiters of truth ourselves," he wrote.

Still, there's been reflection among Facebook's executives about what possible role the company played in shaping the opinions and votes of Americans, according to a report by The New York Times. The top brass called a meeting with Facebook's policy team and decided the company should discuss the issue at its quarterly all-hands meeting, the report said. That meeting happened last week, a Facebook spokesman said.

But even as other top Facebook executives questioned the social network's role, Zuckerberg, in private, remained steadfast in the idea that Facebook could "not unduly affect the way people think and behave," the Times wrote.

Zuckerberg wasn't available for comment on Monday.