
Facebook amps up 'fake news' antidotes. Plus: fact-checkers!

Your news feed will downplay items with "fake news" red flags, and Facebook will make it easier to report hoaxes. Humans might take closer looks now, too.

Joan E. Solsman

Facebook plans to make it easier to report possible hoaxes, add warnings before you share a disputed article and downplay questionable stories in your news feed, the company said in a blog post Thursday.

And it's enlisting the help of a particularly powerful technology: real, trained human brains.

"It's important to us that the stories you see on Facebook are authentic and meaningful," the post reads. "We're excited about this progress, but we know there's more to be done. We're going to keep working on this problem for as long as it takes to get it right."

The changes respond to criticism that Facebook's news feed algorithms -- the software that picks the first posts you see -- sometimes fan the flames of "fake news" and allow misinformation to thrive. Blatantly false reports rose to new prominence during the US election. Critics argue that hoax stories about the presidential candidates sowed confusion among voters, possibly affecting their decisions at the polls, and some surveys suggest that confusion was real.

[Photo: Facebook has 1.8 billion people using it at least once a month. Claudia Cruz/CNET]

Facebook on Thursday outlined measures it has tested and is now rolling out.

The company said it will make it easier to report a hoax by clicking on the upper right-hand corner of a post.

It kicked off a program with third-party fact-checking organizations such as the AP and ABC News that have signed a code of principles laid out by Poynter, a nonprofit journalism institute. If those fact-checkers identify a story as fake, Facebook flags the item as disputed and links to an article explaining why.

Those disputed stories appear lower in the news feed, and you'll see a warning before you share one. Once a story is flagged, it can't be promoted or turned into an ad.
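To make those mechanics concrete, here's a rough sketch in Python of how a "disputed" flag might ripple through a ranking and ads pipeline. Facebook hasn't published its code; every name, penalty and rule below is invented for illustration.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Story:
    url: str
    base_rank_score: float  # whatever the ranking model produces
    disputed: bool = False
    fact_check_links: List[str] = field(default_factory=list)

def flag_as_disputed(story: Story, explanation_url: str) -> None:
    """Called when a fact-checking partner identifies the story as fake."""
    story.disputed = True
    story.fact_check_links.append(explanation_url)

def feed_rank_score(story: Story) -> float:
    # Disputed stories appear lower in the news feed; the 0.3 penalty
    # is an arbitrary stand-in, not Facebook's real weighting.
    return story.base_rank_score * (0.3 if story.disputed else 1.0)

def can_promote(story: Story) -> bool:
    # Once flagged, a story can't be promoted or turned into an ad.
    return not story.disputed

def share_warning(story: Story) -> Optional[str]:
    # Shown before a user shares a disputed story.
    if story.disputed:
        return "Disputed by third-party fact-checkers: " + ", ".join(story.fact_check_links)
    return None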

In addition, Facebook said it was testing a system that would use engagement trends as signals that a post may be a hoax. "We've found that if reading an article makes people significantly less likely to share it, that may be a sign that a story has misled people in some way," the company said. Facebook will test whether to use this signal as a factor in news feed ranking.
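In code, that signal might amount to comparing share rates before and after people read a story. The sketch below is purely illustrative; the function name, counts and threshold are assumptions, not Facebook's actual logic.

def misleading_score(shares_without_reading: int, impressions_without_reading: int,
                     shares_after_reading: int, reads: int) -> float:
    """Relative drop in share rate after reading (0 means no drop)."""
    rate_before = shares_without_reading / max(impressions_without_reading, 1)
    rate_after = shares_after_reading / max(reads, 1)
    if rate_before == 0:
        return 0.0
    return max(0.0, (rate_before - rate_after) / rate_before)

# Example: headline-only viewers share 8% of the time, readers only 1%.
score = misleading_score(80, 1000, 5, 500)
if score > 0.5:  # invented threshold
    print(f"possible hoax signal: share rate dropped {score:.0%} after reading")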

Facebook last month updated its policy to include fake-news sites among those featuring misleading or illegal content that it will refuse to display ads on. But top executives had been dismissive of the idea that Facebook played a decisive role in the election's result. Founder Mark Zuckerberg initially called the notion that fake news on Facebook influenced the election "a pretty crazy idea."

On Thursday, the company detailed other ways it would defang the financial rewards of fake news. It eliminated the ability to "spoof" domain names, which should cut down on fake sites masquerading as trusted publishers. It said it's also analyzing publisher sites to detect where its policies need stricter enforcement.
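A spoofed-domain check could be as simple as flagging hosts that resemble, but don't exactly match, a trusted publisher's domain. This naive Python sketch, with an invented allowlist, shows the idea; Facebook's real detection is surely more sophisticated.

from urllib.parse import urlparse

TRUSTED = {"nytimes.com", "wsj.com", "abcnews.go.com"}  # stand-in allowlist

def looks_spoofed(url: str) -> bool:
    host = urlparse(url).netloc.lower().split(":")[0]
    if host.startswith("www."):
        host = host[4:]
    if host in TRUSTED or any(host.endswith("." + t) for t in TRUSTED):
        return False  # exact match or genuine subdomain of a trusted site
    # Naive substring test: catches lookalikes such as "abcnews.com.co",
    # though a real system would need far stronger matching.
    return any(t.split(".")[0] in host for t in TRUSTED)

print(looks_spoofed("http://abcnews.com.co/article"))  # True: classic spoof
print(looks_spoofed("https://www.nytimes.com/story"))  # False: legitimate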