
Thrown Into Facebook Jail? Meta Says It Will Explain What You Did Wrong

The social network is changing a "strike" system that restricts users who violate its rules from posting on the platform.

Queenie Wong, Former Senior Writer
[Image: Facebook logo above a phone screen showing the Meta logo. Meta is the parent company of Facebook. James Martin/CNET]

Facebook users get frustrated when the social network blocks them from posting on the platform, especially when they don't understand what rules they violated. 

Now Facebook's parent company, Meta, says it plans to do a better job of explaining to users why it removed their posts before resorting to longer account restrictions.

"The vast majority of people on our apps are well-intentioned," said Monika Bickert, Meta's vice president of content policy in a blog post on Thursday. "Historically, some of those people have ended up in 'Facebook jail' without understanding what they did wrong or whether they were impacted by a content enforcement mistake."

Bickert didn't provide much detail about how Facebook plans to help users better understand why the social network pulled down their posts, though the company has been updating its notifications and content rules.

Facebook currently applies "strikes" to a user's account when they break certain rules. Users can be blocked from creating content on Facebook for as little as a day or as long as 30 days, depending on how many strikes they have. Facebook is changing this system so that users will need more strikes before they're blocked from creating content on the platform for longer stretches.

Previously, Facebook users would be restricted from posting, commenting or creating other content on the social network for a day if they had two strikes on their account. Now Facebook says that the one-day restriction from creating content will occur when a user has seven strikes on their account. 

"Under the new system, we will focus on helping people understand why we have removed their content, which is shown to be more effective at preventing re-offending, rather than so quickly restricting their ability to post," Bickert said.

One strike will typically result in a warning without account restrictions. If a user has two to six strikes on their account, they will be blocked from using some features such as posting in a Facebook group for a "limited amount of time." Users will be restricted from posting on the site for three days if they have eight strikes on their account, seven days if they have nine strikes and 30 days if they have 10 or more strikes.
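Taken together, the new tiers map a strike count to an escalating penalty. The short Python sketch below is purely illustrative and based only on the tiers described above; the function name, the wording of the outcomes and the handling of zero strikes are assumptions, not anything Meta has published.

```python
def restriction_for_strikes(strikes: int) -> str:
    """Illustrative mapping of Facebook's new strike tiers as described by Meta.

    Hypothetical helper: the name, return strings and zero-strike handling
    are assumptions for illustration only.
    """
    if strikes <= 0:
        return "no action"
    if strikes == 1:
        return "warning, no account restrictions"
    if strikes <= 6:  # two to six strikes
        return "limited-time block on some features, such as posting in groups"
    if strikes == 7:
        return "1-day block on creating content"
    if strikes == 8:
        return "3-day block on creating content"
    if strikes == 9:
        return "7-day block on creating content"
    return "30-day block on creating content"  # 10 or more strikes
```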

Facebook's changes show how the company is responding to criticism that it needs to do a better job of explaining how it moderates content to its 2 billion daily users. An oversight board tasked with reviewing Facebook's toughest content moderation decisions recommended Meta give users more details about its strike system. Facebook's 2020 civil rights audit, meanwhile, said that the social network "has also been criticized for lacking transparency or notice before penalties are imposed, and leaving users in 'Facebook jail' for extended periods seemingly out of nowhere." 

Facebook, which relies on both automated technology and human reviewers to police content, has made mistakes. Some users, such as anti-racism activists, also grapple with their accounts being repeatedly reported for rule violations even when they haven't broken any policies. The company said the new strike system will be fairer and more effective.

"The implications of overenforcement are real — when people are unintentionally caught up in this system, they may find it hard to run their business, connect with their communities or express themselves," Bickert said. 

Meta also on Thursday released a quarterly report about how it enforces its rules, what it's doing to respond to the oversight board's recommendations and what content is popular on Facebook.