
Facebook Parent Meta Shares Details About Newsworthy Posts It Leaves Up

Meta and Facebook's logos in front of a blue background
Facebook parent Meta released a quarterly report about how it enforces its content rules. 
James Martin/CNET

What's happening

Meta, Facebook's parent company, shared data for the first time about the number of times it has applied its newsworthiness allowance. The company sometimes leaves up posts that could violate its rules if it determines the posts are newsworthy.

Why it matters

How Facebook balances newsworthiness with public safety has been an important question, especially ahead of the 2022 US midterm elections.

Facebook parent company Meta said that from June 2021 to June 2022 it made 68 "newsworthiness allowances" for pieces of content that might violate its rules.

It's the first time Meta has revealed how many times it's applied the exemption, under which it leaves up newsworthy content that could break its rules. Facebook introduced the exemption in 2016 after the social network faced public backlash for removing an iconic Vietnam War photo of a naked girl fleeing a napalm attack. The company initially said the image violated its rules against child nudity, but it reinstated the photo after considering its historical significance.

How Facebook balances newsworthiness against the risk of public harm has been an important question, especially ahead of the 2022 US midterm elections. The company doesn't presume that any person's speech, including that of politicians, is inherently newsworthy. Meta said about 20%, or 13, of its newsworthiness allowances were issued for posts by politicians.

A semi-independent board that reviews the company's toughest content moderation decisions recommended that Facebook release data about its newsworthiness allowance. Known as the Oversight Board, the group operates separately from Facebook but receives funding from Meta through a trust. The board made this recommendation in its decision to uphold the company's call to suspend Donald Trump, who was US president at the time, from the platform following the deadly Jan. 6 Capitol riot.

Trump will be suspended from Facebook until at least January 2023, and the social network said it'll look to experts to assess whether the risk to public safety has declined. Trump is reportedly considering a 2024 presidential run.

Monika Bickert, Meta's vice president of content policy, said the company will look at instances of violence, restrictions on peaceful assembly and other markers of civil unrest. 

"If at that time we determine there's still a serious risk to public safety, then we will extend the restriction for a period of time and then we'll continue to evaluate," she said during a press conference Thursday.

In an update posted online, Meta shared examples of when it applied its newsworthiness allowance. The Ukrainian Defense Ministry shared a video that briefly showed an unidentified charred body. The company determined this video was newsworthy because it documented an ongoing armed conflict, even though Facebook typically removes such content under its policy against violent and graphic posts. Instead, Facebook placed a warning screen over this content and made it available only to users 18 and older.

Meta said it's also expanding the scope of the Oversight Board so the group can review cases about whether the social network should apply warning screens to content. Bloomberg reported that because of an informal recommendation by the board, Meta is also working on a customer service group to respond to users who had their accounts or posts removed unexpectedly.