
Facebook's leaked rulebooks highlight struggle with content moderation

The New York Times uncovers more than 1,400 pages of documents demonstrating Facebook's growing political influence and its troubles with moderation.

Jackson Ryan

CEO Mark Zuckerberg has some resolutions to write.

James Martin/CNET

In January, Facebook CEO Mark Zuckerberg said he had a lot of work to do in 2018 if he wanted to fix Facebook. It seems that resolution will still be on the table in 2019.

Scandal after scandal hit the social media giant this year. Now more than 1,400 pages of leaked documents, obtained by The New York Times, have revealed the rulebooks the company uses to moderate content on its platform, how it polices posts and the shortcomings of the 7,500-plus moderators who review and control posts from its 2 billion users.

The documents, published Thursday by the Times, are purportedly used to advise thousands of moderators on how to handle content that may be deemed problematic and to "distill highly complex issues into simple yes-or-no rules." The moderation work is outsourced, and the Times notes that some moderators rely on Google Translate to make split-second decisions about what counts as hate speech.

The Times reported that those workers sometimes leave up posts that could lead to violence because they're unsure how to apply rules that "don't always make sense."

The investigation was led by New York Times writer Max Fisher, who tweeted after the article was published that Facebook is making "many, many mistakes" in its efforts to control the types of content on its platform. Those mistakes include clerical errors that allowed an extremist group to continue posting on the platform in Myanmar, and outdated guidelines, contained in "disorganized PowerPoint presentations," intended to curb a rising tide of animosity and nationalism in the Balkans.

Perhaps the most damning line in the article focuses on Facebook's growing political power as it experiments with different moderation techniques, bans and content removal.

"In an effort to control problems of its own creation, it has quietly become, with a speed that makes even employees uncomfortable, what is arguably one of the world's most powerful political regulators," according to the story.

The investigation also suggests that in places where Facebook faces extra government scrutiny, such as in Germany, the content moderation is much tighter than in places with less political oversight. "Its decisions often skew in favor of governments, which can fine or regulate Facebook," according to the Times.

News site Motherboard had previously reported on some of the internal documents included in The New York Times piece, such as the Facebook rules around "How to action on emojis" -- or what content moderators should do if they encounter any eggplants, poop or praying hands -- and the trouble the company has policing locally illegal content in countries such as India and Pakistan.

As Zuckerberg said in early 2018, "The world feels anxious and divided, and Facebook has a lot of work to do." In 2019, there's still a lot more that Facebook needs to do to resolve the problems its existence has created.
