The board wants Facebook parent Meta to improve a program known as cross check.
Facebook parent company Meta says its rules about what content is and isn't allowed on its platforms, such as hate speech and harassment, apply to everyone.
But a board tasked with reviewing some of Meta's toughest content moderation decisions said Tuesday the social media giant's claim is "misleading."
In 2021, Meta asked the Oversight Board to look into a program called cross check that allows celebrities, politicians and other high-profile users on Facebook and Instagram to get an extra review if their content is flagged for violating the platform's rules. The Wall Street Journal revealed more details about the program last year, noting that the system shields millions of high-profile users from how Facebook typically enforces its rules. Brazilian soccer star Neymar, for example, was able to share nude photos of a woman who accused him of rape with tens of millions of his fans before Facebook pulled down the content.
In a 57-page policy advisory opinion about the program, the Oversight Board identified several flaws in Meta's cross check program, including that it gives some high-profile users more protection than others. The opinion also raises questions about whether the program is working as intended.
"The opinion details how Meta's cross check program prioritizes influential and powerful users of commercial value to Meta and as structured does not meet Meta's human rights responsibilities and company values, with profound implications for users and global civil society," Thomas Hughes, director of the Oversight Board Administration, said in a statement.
Here's what you need to know about Meta's cross check program:
Meta says the cross check program aims to prevent the company from mistakenly taking action against content that doesn't violate its rules, especially in cases where there's a higher risk tied to making an error.
The company has said it's applied this program to posts from media outlets, celebrities or governments. "For example, we have Cross Checked an American civil rights activist's account to avoid mistakenly deleting instances of him raising awareness of hate speech he was encountering," Meta said in a blog post in 2018.
The company also provides more details about how the program works in its transparency center.
The board concluded the program results in "unequal treatment of users" because content that's flagged for additional review by a human stays on the platform for a longer time. Meta told the board the company can take more than five days to reach a decision on content from users who are part of cross check.
"This means that, because of cross check, content identified as breaking Meta's rules is left up on Facebook and Instagram when it is most viral and could cause harm," the opinion said.
The program also appears to benefit Meta's business interests more than it does its commitment to human rights, according to the opinion. The board pointed out transparency issues with the program. Meta doesn't tell the public who is on its cross-check list and fails to track data about whether the program actually helps the company make more accurate content moderation decisions.
The board asked Meta 74 questions about the program. Meta answered 58 of the questions fully and 11 partially. The company didn't answer five questions.
The board made 32 recommendations to Meta, including that the company should prioritize content that's important for human rights and review users who post such content in a separate workflow from its business partners. A user's follower count or celebrity status shouldn't be the sole factor for receiving extra protection.
Meta should also remove or hide highly severe content that's flagged for violating its rules during the first review while moderators take a second look at the post.
"Such content should not be allowed to remain on the platform accruing views simply because the person who posted it is a business partner or celebrity," the opinion said.
The board also wants Meta to be more transparent about the program by publicly marking some accounts protected by cross check, such as state actors, political candidates and business partners, so the public can hold them accountable for whether they're following the platform's rules. Users should also be able to appeal cross-checked content to the board.
The company said it's reviewing the board's opinion and will respond within 90 days.
Meta said that in the past year it's worked on improving the program, such as by expanding cross-check reviews to all 3 billion users. The company said it uses an algorithm to determine whether content has a higher risk of mistakenly being pulled down. Meta also noted it established annual reviews to look at who is receiving an extra level of review.