Why did Facebook pull your post? Digital rights groups want you to know

The Electronic Frontier Foundation and more than 70 other groups say Facebook should allow all users to appeal if their posts are removed.

Queenie Wong

Facebook CEO Mark Zuckerberg is getting a letter.

James Martin/CNET

When it comes to taking down content, Facebook acknowledges it hasn't always made the right decisions.

The tech firm apologized for removing the iconic "Napalm Girl" photo in 2016 and has grappled with other public blunders.

But as social networks face more pressure from lawmakers and users to pull down hate speech, terrorist content, nudity and other offensive items, digital rights groups say Facebook should improve its appeal process and share more data about removed posts.

On Tuesday, the Electronic Frontier Foundation, the Center for Democracy and Technology, the American Civil Liberties Union and more than 70 other groups sent a letter to Facebook, urging the company to allow all users to appeal if their posts are removed.

In April, the tech firm started allowing users to appeal if their posts were pulled down for nudity or sexual activity, hate speech or graphic violence. But the groups say Facebook's appeals process "doesn't go far enough." All users should have the option to appeal, and the decisions should be made by a human reviewer, not a computer, the groups said in the letter.

They also want Facebook to share more data detailing why and how posts are pulled down, along with error rates and the number of successful appeals. The letter references principles the groups think Facebook should include in its content moderation rules.

"We know from years of research and documentation that human content moderators, as well as machine learning algorithms, are prone to error, and that even low error rates can result in millions of silenced users when operating at massive scale," the letter said.

A Facebook representative signaled that the company is open to doing more.

"These are very important issues. It's why we launched an appeals process on Facebook in April, and also published our first transparency report on our effectiveness in removing bad content. We are one of the few companies to do this -- and we look forward to doing more in the future," the spokesperson said in a statement.

Nate Cardozo, an EFF senior staff attorney, said digital rights groups have demanded those changes from Facebook in the past, but the company has been "dragging their heels."

"Facebook fundamentally thinks they're doing a good job with content moderation and they could not be more wrong," he said.

Below is the text of the letter sent to Zuckerberg:

First published, Nov. 13, 6 a.m. PT.
Update, 9:51 a.m. PT: Adds statement from Facebook.
