Facebook's content oversight board on Thursday announced its first round of decisions, making rulings on five cases that involve hate speech, incitement of violence and other thorny topics for the social network. The board overturned four of Facebook's content moderation decisions, calling for posts to be restored, and upheld one.
"This is the first time that Facebook has been overruled on a content decision by the independent judgment of the oversight board and through our decisions we believe the board has an ability to provide a critical independent check on how Facebook moderates content and to begin reshaping the company's policies over the long term," Helle Thorning-Schmidt, the former Danish prime minister and co-chair of the oversight board, said in a press call.
The independent board was established last year to make the final call on some of Facebook's most difficult content decisions. It chose its first slate of cases for review in December, selecting six from more than 20,000 brought to the board since it opened its doors in late October 2020. Five of the cases were brought by users, while one was brought by the social media company itself. One case that was originally selected for review was thrown out after it "became unavailable for review by the Board as a result of user action."
The oversight board overturned Facebook's decisions to remove four posts under its community guidelines for hate speech, inciting violence, misinformation and adult nudity. Facebook said it has reinstated that content. The social network also plans to restore identical content tied to the board's decisions but didn't provide an estimate of how many posts could be reinstated.
"Since we just received the board's decisions a short time ago, we will need time to understand the full impact of their decisions," Monika Bickert, who oversees Facebook's content policy, said in a blog post.
In one ruling, the board recommended Facebook create a new standard about health misinformation because it found the social network's rules addressing that content "to be inappropriately vague." A Facebook user in October posted a video and text in a group that criticized a French government agency's refusal to authorize hydroxychloroquine, a malarial drug, and another drug to treat COVID-19. Researchers have found hydroxychloroquine doesn't benefit adults hospitalized with the respiratory illness. The social network, which referred the case to the board, said it took down the post because it claimed a cure for the coronavirus exists, which could lead to "imminent...physical harm." The board disagreed that the post would cause imminent harm because the drugs require a prescription in France and the post wasn't encouraging people to buy or take them without one.
In another ruling, the board overturned Facebook's decision to remove a post for violating its rules against hate speech. The Myanmar user had posted photos of a deceased child that included the phrase "[there is] something wrong with Muslims psychologically." The board, though, said Facebook needed to consider the context of the post and viewed the phrase as "commentary on the apparent inconsistency between Muslims' reactions to events in France and in China."
The decision sparked criticism from Muslim Advocates on Thursday, which accused the board of bending over backward to excuse anti-Muslim hate. "It is clear that the Oversight Board is here to launder responsibility for [Mark] Zuckerberg and Sheryl Sandberg. Instead of taking meaningful action to curb dangerous hate speech on the platform, Facebook punted responsibility to a third party board that used laughable technicalities to protect anti-Muslim hate content that contributes to genocide," Muslim Advocates spokesperson Eric Naing said in a statement.
The board upheld Facebook's decision to remove a November 2020 post that contained a demeaning slur to describe the Turkic ethnic group Azerbaijanis, saying that a majority of the board "found that the removal of this post was consistent with international human rights standards on limiting freedom of expression."
In January, the board said it would weigh in on Facebook's decision to suspend the accounts of former President Donald Trump. The social media giant blocked Trump's accounts on Facebook and Instagram following the deadly US Capitol riot on Jan. 6, saying his posts posed an unacceptable risk. The board hasn't yet released its ruling on this case. Public comments for that case opened Friday, and a panel has already started working on it. "Facebook argues that an indefinite ban does not provide certainty to Mr. Trump or the public as to the future treatment of his speech, but prioritizes safety in a period of civil unrest in the US with no set end date," the board said in a blog post on Friday.
The board is also reviewing another case submitted by a user in the Netherlands. Facebook pulled down a 17-second video posted by the user in December for violating its rules against hate speech. In the video, a child meets two adults whose faces are painted black to portray Zwarte Piet, also referred to as Black Pete, a companion of Saint Nicholas during a Dutch holiday festival. Facebook doesn't allow "caricatures of Black people in the form of blackface," but the user said the video was meant for their child.
Since October, more than 150,000 cases have been submitted to the board.