Facebook's fight against fake news remains a bit of a mystery

We could tell you, but it would scuttle our investigations, a Facebook security expert tells reporters.

Laura Hautala

Facebook is fighting hard against misinformation that's coordinated and aims to manipulate the public. You know, the kind of thing the company says it didn't do enough to stop in 2016.

But in a phone press briefing Tuesday, a Facebook security expert wouldn't say whether the company is seeing the same kind of coordinated effort now.

"We know they're going to continue," said Nathaniel Gleicher, head of cybersecurity policy at Facebook. That much is inevitable, he said, without offering details. Even when the company knows of a misinformation campaign, Gleicher said after repeated questioning from reporters on the call, "we always have to be careful about compromising the investigation."

What's more, the company stood firm on its refusal to remove fake news from its site -- even if a report promotes conspiracy theories that many people find offensive. A case in point from just this week: Facebook allowed unsubstantiated comments by InfoWars' Alex Jones, made against special counsel Robert Mueller in a Facebook livestream, to stand.

In other words, the call Tuesday showed Facebook again attempting to explain exactly what it will and won't do to combat the spread of false information on its platform. It's an issue that's dogged Facebook since US intelligence agencies concluded that Russia ran a coordinated misinformation campaign leading up to the 2016 presidential election won by Donald Trump.

On the call, the company did talk about its efforts tied to the Mexican presidential election earlier this month. As part of its bid to keep misinformation from spreading, the company removed tens of thousands of fake "likes" from candidate pages and took down dozens of accounts impersonating candidates, said Diego Bassante, a manager on Facebook's Latin American Politics & Government team. It also worked with Verificado, a fact-checking organization, to identify misinformation online.

That mirrors the company's efforts around the world. Facebook is partnering with 27 fact-checking organizations in 17 different countries. What's more, it's applying the tools it already uses for cybersecurity to the problem of identifying coordinated misinformation campaigns.

The work involves taking down a lot of accounts -- more than a million accounts a day "at the point of creation," said Samidh Chakrabarti, a Facebook product manager for civic engagement and elections. The company didn't immediately provide information on how that compares to the number of accounts it took down daily in the past.

According to Gleicher, that's Facebook's most important tool for stopping targeted misinformation campaigns. To get better at it, the company is combining machine learning, which flags problem accounts, with human investigators who examine the data and look for the true bad actors inside.

The company also works with government investigative bodies around the world, including those in Mexico, Brazil and the US.

False information can stay

Facebook treats false information differently than fake accounts. A post sharing a false news story won't get taken down like a fake account would.

As Tessa Lyons, Facebook product manager for News Feed, said, "If you are who you say you are and you're not violating our community standards, we don't believe we should stop you from posting on Facebook."

The post will get marked as less relevant by Facebook's ranking algorithm, bumping it lower down on users' feeds. That lowers the chance for users to encounter it when it's posted by a Facebook friend or a page that they follow. What's more, if the story has been found by fact-checkers to be false, Facebook will alert users before they post it. 

Lyons didn't answer questions about whether Facebook would ever take down a page like InfoWars, which has prompted widespread outrage, as well as a lawsuit over its repeated denial of the shooting that killed schoolchildren in Newtown, Connecticut. Last week, the company defended its policy of leaving the InfoWars page up on Facebook.

Ranking news stories as false can greatly diminish the number of people who see them, even if they remain posted on Facebook, Lyons said. According to the company, it limits their future distribution by more than 80 percent.
