
Facebook now tells you exactly why it takes down posts

The social network is also expanding its appeals process around objectionable posts -- as Facebook tries to purge toxic content.

Richard Nieva, former senior reporter
Richard Nieva was a senior reporter for CNET News, focusing on Google and Yahoo. He previously worked for PandoDaily and Fortune Magazine, and his writing has appeared in The New York Times, on CNNMoney.com and on CJR.org.

Facebook is giving its users more insight into what is and isn't allowed on the social network.

Josh Edelson/AFP/Getty Images

Facebook wants to be more transparent about what is and isn't allowed on the world's largest social network.

On Tuesday, the company released an updated version of its Community Standards guidelines -- the rules that dictate what's acceptable content for its 2.2 billion users to post.

Facebook's rules themselves haven't changed. What's new is the release of the comprehensive guidelines its content moderators use to handle objectionable material. Previously, users could see only surface-level descriptions of what they couldn't post. Now the guidelines detail how Facebook handles specific situations and how it defines certain terms.

For example, the social network says it defines a "mass murder" as a homicide that "results in 4 or more deaths in one incident." And in the section on harassment, Facebook says people can't send a message that "calls for death, serious disease or disability, or physical harm" or "claims that a victim of a violent tragedy is lying about being a victim."


Facebook is also expanding its rules around appeals. Previously, you could request an appeal only if your Facebook profile, Page or Group was taken down; now you can challenge the social network over the removal of an individual piece of content. Users can also appeal Facebook's decision to leave up content they'd reported as a violation of the company's rules.

"These standards will continue to evolve as our community continues to grow," Monika Bickert, vice president of product policy and counterterrorism, said last week at a press briefing in Facebook's Menlo Park, California, headquarters. "Now everybody out there can see how we're instructing these reviewers."

Facebook has been in the hot seat since last month's scandal involving Cambridge Analytica, a digital consultancy that improperly accessed data on up to 87 million Facebook users without their consent. The controversy has put several of Facebook's policies and practices under the microscope.

Bickert says the new transparency around Facebook's Community Standards doesn't have anything to do with that controversy, however.

"I've been in this job for five years," Bickert said. "We've wanted to do this for that entire time."

After Facebook published the guidelines, the Anti-Defamation League applauded Facebook for its transparency, but said the company needs to go further. The organization wants Facebook to work with independent organizations and academic researchers "to open up Facebook's data around hate speech for study."

"It is imperative for Facebook to explain how hate content spreads on the platform, and how their policies are enforced in ways consistent with both internal standards and with the ethical standards of civil society," the ADL said in a statement. 

Hot seat

Facebook has been under pressure to clarify its moderation guidelines since the 2016 election, when Russian trolls abused Facebook with a combination of paid ads and organic posts to sow discord among American voters. Many conservatives have also criticized the platform for what they see as political bias.

When Mark Zuckerberg was grilled by Congress two weeks ago, lawmakers repeatedly asked him about what is -- and isn't -- allowed on Facebook.

Rep. David McKinley, a Republican from West Virginia, mentioned illegal listings for opioids posted on Facebook, and asked why they had not been taken down. Other Republican lawmakers asked why the social network had removed posts by Diamond and Silk, two African-American Trump supporters with 1.6 million Facebook followers.

In November, Facebook said it would increase its number of content moderators to 20,000, up from 10,000 last year. In his testimony to Congress, Zuckerberg said the real breakthrough will come when artificial intelligence tools are able to proactively police the platform's content, though he added it will take "years" before that kind of technology is good enough.

In the meantime, Bickert said Facebook's moderators do a good job overall of taking down inappropriate material. Still, some things fall through the cracks.

"We have millions of reports every week," Bickert said. "So even if we maintain 99 percent accuracy, there's still going to be mistakes made."

Facebook has also talked about further expanding its appeals process to include opinions of people outside the company. In an interview with Vox earlier this month, Zuckerberg mentioned the idea of a Facebook "Supreme Court," made up of independent members who don't work for the company. Their role would be to make the "final judgement call" on what's acceptable speech on Facebook.

Bickert didn't address that idea last week, but said the company is "always exploring new options" for appeals.

Facebook also said it wants community input on how its guidelines should evolve. In May, it's launching a new forum called Facebook Open Dialogue to get feedback on its policies. The first events will take place in Paris, Berlin and the UK. Events in the US, India and Singapore are planned for later this year.

First published April 23, 5:36 p.m. PT.
Update, April 24 at 10:36 a.m. PT: Adds statement from the Anti-Defamation League.
