Facebook on Wednesday published a trove of information detailing its latest enforcement efforts on everything from COVID-19 misinformation to counterfeit products. The company also introduced a new Transparency Center to house all of its reports. One thing the social network didn't have many details on: exactly how it plans to handle former President Donald Trump's suspension.
Earlier this month, Facebook's content oversight board upheld the company's decision to suspend Trump but told it to reconsider the length of time he was barred from the social network. The board, which is tasked with reviewing some of Facebook's most difficult content decisions, told Facebook to complete its review within six months.
Facebook on Wednesday said it's working through the implications of the board's many recommendations but didn't have a specific timeline.
"The board upheld our decision, but they did not specify the appropriate duration of the penalty," said Monika Bickert, vice president of content policy at Facebook. "This is important work, and we want to get it right."
Beyond Trump and other oversight board decisions, Facebook's new Transparency Center is intended to be a single destination for information around the social network's integrity and transparency efforts.
Facebook released its Community Standards Enforcement Report for the first three months of 2021, which looks at the company's moderation efforts around hate speech, nudity, violent content and other harmful posts on its platforms. The company highlighted that it's able to "proactively detect about 97% of hate speech content" that gets removed from the social network. For the first time, the company also shared information on content it proactively removed from Facebook and Instagram over potential counterfeiting or copyright infringement.
Facebook also released its Transparency Report for the second half of 2020, which details government requests for user data, and it provided an update on its fight against COVID-19 misinformation.
From the start of the pandemic to April 2021, the company said, it's removed more than 18 million pieces of COVID-related misinformation from Facebook and Instagram. Bickert said Facebook is focused on removing posts that directly contradict "prevailing health guidance and could also create a health risk," such as falsely saying some groups of people are immune to the virus.
CNET's Rae Hodge contributed to this report.