
Facebook's Zuckerberg speaks up about fake-news fixes

With the social network facing criticism for circulating sham news stories, CEO Mark Zuckerberg gives a glimpse of possible solutions.

Edward Moyer, Senior Editor
Facebook CEO Mark Zuckerberg: "We do not want to be arbiters of truth ourselves, but instead rely on our community and trusted third parties." (Getty Images)

Here's some news that isn't fake.

Facebook CEO Mark Zuckerberg took to the social network Friday night to further address the outcry over bogus news stories appearing on the site.

In a post to his Facebook page, Zuckerberg said his company is committed to fixing the problem and outlined several possible solutions the social network is exploring.

"Normally we wouldn't share specifics about our work in progress," he wrote, "but given the importance of these issues and the amount of interest in this topic, I want to outline some of the projects we already have under way."

Among them:

  • creating technical tools for spotting fake news even before it's flagged by Facebook users,
  • working with third-party fact-checking groups to vet stories,
  • showing warnings alongside stories that have been flagged by Facebookers or third-party groups,
  • tweaking ad policies to discourage fake news stories linked to spam, and
  • consulting journalists about fact-checking techniques.

Zuckerberg called the problem a tricky one, with the company wanting to stop the spread of BS stories while at the same time avoiding censorship.

"We believe in giving people a voice," he wrote, "which means erring on the side of letting people share what they want whenever possible.

"We need to be careful not to discourage sharing of opinions or mistakenly restricting accurate content. We do not want to be arbiters of truth ourselves, but instead rely on our community and trusted third parties."

Zuckerberg's post comes as Facebook, Google and others face a barrage of criticism for letting sham articles circulate by way of their sites.

During the recent US presidential election, a number of bogus articles made the rounds, including made-up stories claiming that President Barack Obama had banned the playing of the national anthem at US sporting events and that an FBI agent tied to the Hillary Clinton email scandal had been found dead.

Some critics say the spread of such stories on Facebook tipped the election, a charge Zuckerberg last week dismissed as "pretty crazy," though he later added that the social network had made progress in addressing fake news and would keep working at it.

Even Obama has weighed in on the issue.

"Particularly in the social media era, when so many get information from sound bites and snippets off their phone, if we can't discriminate between serious arguments and propaganda, then we have problems," the president warned during a press conference Thursday.

Zuckerberg's Friday post suggests he's aware the issue can't easily be dismissed.

"I want you to know that we have always taken this seriously," he wrote, "we understand how important the issue is for our community and we are committed to getting this right."

Zuckerberg urged patience, however, saying the search for a fix would involve trial and error and that some of the solutions being explored by Facebook might not pan out.

"Some of these ideas will work well," he wrote, "and some will not."

Here's the full text of the post:

A lot of you have asked what we're doing about misinformation, so I wanted to give an update.

The bottom line is: we take misinformation seriously. Our goal is to connect people with the stories they find most meaningful, and we know people want accurate information. We've been working on this problem for a long time and we take this responsibility seriously. We've made significant progress, but there is more work to be done.

Historically, we have relied on our community to help us understand what is fake and what is not. Anyone on Facebook can report any link as false, and we use signals from those reports along with a number of others -- like people sharing links to myth-busting sites such as Snopes -- to understand which stories we can confidently classify as misinformation. Similar to clickbait, spam and scams, we penalize this content in News Feed so it's much less likely to spread.

The problems here are complex, both technically and philosophically. We believe in giving people a voice, which means erring on the side of letting people share what they want whenever possible. We need to be careful not to discourage sharing of opinions or mistakenly restricting accurate content. We do not want to be arbiters of truth ourselves, but instead rely on our community and trusted third parties.

While the percentage of misinformation is relatively small, we have much more work ahead on our roadmap. Normally we wouldn't share specifics about our work in progress, but given the importance of these issues and the amount of interest in this topic, I want to outline some of the projects we already have underway:

- Stronger detection. The most important thing we can do is improve our ability to classify misinformation. This means better technical systems to detect what people will flag as false before they do it themselves.

- Easy reporting. Making it much easier for people to report stories as fake will help us catch more misinformation faster.

- Third party verification. There are many respected fact-checking organizations and, while we have reached out to some, we plan to learn from many more.

- Warnings. We are exploring labeling stories that have been flagged as false by third parties or our community, and showing warnings when people read or share them.

- Related articles quality. We are raising the bar for stories that appear in related articles under links in News Feed.

- Disrupting fake news economics. A lot of misinformation is driven by financially motivated spam. We're looking into disrupting the economics with ads policies like the one we announced earlier this week, and better ad farm detection.

- Listening. We will continue to work with journalists and others in the news industry to get their input, in particular, to better understand their fact-checking systems and learn from them.

Some of these ideas will work well, and some will not. But I want you to know that we have always taken this seriously, we understand how important the issue is for our community and we are committed to getting this right.
