Facebook launches new Transparency Center with data on content removal, government requests

The site will also house updates on how Facebook is responding to decisions by its content oversight board, including about former President Donald Trump.

Carrie Mihalcik Former Managing Editor / News
Carrie was a managing editor at CNET focused on breaking and trending news. She'd been reporting and editing for more than a decade, including at the National Journal and Current TV.

Facebook released its Community Standards report for the first three months of 2021.

Sarah Tew/CNET

Facebook on Wednesday published a trove of information detailing its latest enforcement efforts on everything from hate speech to COVID-19 misinformation to counterfeit products. The company also introduced a new Transparency Center to house all of its reports. One thing the social network didn't have many details on: exactly how it plans to handle former President Donald Trump's suspension.

Earlier this month, Facebook's content oversight board found that the social media giant should keep Trump's suspension in place but reconsider the length of time he was barred from the social network. The board, which is tasked with reviewing some of Facebook's most difficult content decisions, told Facebook to complete its review within six months. 

Facebook on Wednesday said it's working through the details of the implications of the board's many recommendations but didn't have a specific timeline. 

"The board upheld our decision, but they did not specify the appropriate duration of the penalty," said Monica Bickert, vice president of content policy at Facebook. "This is important work, we want to get it right."

Beyond Trump and other oversight board decisions, Facebook's new Transparency Center is intended to be a single destination for information around the social network's integrity and transparency efforts. 

Facebook released its Community Standards Enforcement Report for the first three months of 2021, which looks at the company's moderation efforts around hate speech, nudity, violent content and other harmful posts on its platforms. The company highlighted that it's able to "proactively detect about 97% of hate speech content" that gets removed from the social network. The company also for the first time shared information on content it proactively removed from Facebook and Instagram for potential counterfeit content or copyright infringement.

Facebook also released its Transparency Report for the second half of 2020, which details government requests for user data, and it provided an update on its fight against COVID-19 misinformation. 

From the start of the pandemic to April 2021, the company said, it removed more than 18 million pieces of COVID-related misinformation from Facebook and Instagram. Bickert said Facebook is focused on removing posts that directly contradict "prevailing health guidance and could also create a health risk," such as falsely claiming that some groups of people are immune to the virus.

CNET's Rae Hodge contributed to this report.