The social network says it wants to "improve" the process of working with the quasi-independent panel.
Facebook says it can't keep up with the recommendations its oversight board is producing and wants changes to the process.
In a report released Tuesday, the giant social network says it wants to work with the quasi-independent body to "improve the recommendation process." The company, which recently rebranded as Meta, didn't detail what changes it might be seeking but said it had spoken with the board.
The company praised the oversight board, saying it had "pushed" Facebook to be more transparent about its operations. Based on the board's recommendations, Facebook has developed guidelines for satire, updated nudity detection to protect health-related posts and begun testing messaging for hate speech violations in languages other than English, the company said.
Still, Facebook said the board was issuing recommendations faster than the company could implement them.
"We believe the current design of the recommendation process may not be the best way to bring about the long-term, structural changes the board is pushing us to undertake," Facebook said in the report.
Monika Bickert, vice president of content policy at Facebook, declined to offer more details about what the company was seeking during a conference call with reporters. The board didn't immediately respond to a request for comment.
Facebook's desire to change its relationship with the oversight board comes as the company deals with one of the roughest patches in its 17-year history. The company has been the subject of a series of scathing stories in The Wall Street Journal and elsewhere based on leaked internal documents that suggest the company was aware of the harm its products were causing but prioritized profits. Frances Haugen, the whistleblower who leaked the documents, has testified before Congress and the UK Parliament. The findings from the documents have revived scrutiny by US and UK lawmakers.
Critics of Facebook, which was used by Russia to influence the 2016 presidential election, say the company doesn't take its responsibility seriously enough and don't believe the oversight board moves fast enough or goes far enough. A group of vocal critics has set up a shadow organization, which it calls the Real Facebook Oversight Board.
To date, the board's highest-profile action was upholding Facebook's suspension of former President Donald Trump's Facebook and Instagram accounts. In May, the board said the social network was justified in suspending Trump amid concerns he could foment more violence after the deadly Capitol Hill riot on Jan. 6.
Here's what you need to know about Facebook's oversight board:
Let's get something straight: The oversight board doesn't do the same job as content moderators, who make decisions on whether individual posts to Facebook comply with the social network's rules. The board exists to support the "right to free expression" of Facebook's nearly 3 billion users.
The board functions a lot like a court, which isn't surprising given that a Harvard law professor came up with the idea. Users who believe content moderators have removed their posts improperly can appeal to the board for a second opinion. If the board sides with the user, Facebook must restore the post. Facebook can also refer cases to the board.
The oversight board can also make suggestions for changes to Facebook's policies. Over time, those recommendations could affect what users are allowed to post, which could make content moderation easier.
Facebook gets criticized by just about everybody for just about every decision it makes. Conservatives say the company and the rest of Silicon Valley are biased against their views. They point to the suspensions of Trump and right-wing extremist Alex Jones.
The social network doesn't get much love from progressives, either. They complain Facebook has become a toxic swamp of racist, sexist and misleading speech. Some progressive groups underlined their concerns in summer 2020 by calling on companies to avoid advertising on Facebook and publicizing the boycott with the hashtag #StopHateForProfit.
The oversight board can help Facebook deal with those complaints while lending credibility to the social network's community standards, a code of conduct that prohibits hate speech, child nudity and a host of other offensive content. By letting an independent board guide decisions about this content, Facebook hopes it will develop a more consistent application of its rules, which in the past have generated complaints for appearing arbitrary.
One example: Facebook's 2016 removal of an iconic Vietnam War photo that shows a naked girl fleeing a napalm attack. The company defended the removal, saying the Pulitzer Prize-winning image violated its rules on child nudity. Facebook reversed its decision shortly afterward as global criticism mounted over the removal of a vital historical image.
It's no secret that Facebook has a trust problem. Regulators, politicians and the public all question whether the decisions the company makes serve its users or itself. Making the board independent of Facebook should, the company reckons, give people confidence that its decisions are being made on the merits of the situation, not on the basis of the company's interests.
In spring 2020, Facebook named the first 20 members of the board, a lineup that includes former judges and current lawyers, professors and journalists. It also includes a former prime minister and a Nobel Peace Prize winner. The board can be expanded to 40 people. The members have lived in nearly 30 countries and speak almost as many languages. About a quarter come from the US and Canada.
Serving on the board is a part-time job, with members paid through a multimillion-dollar trust. Board members will serve a three-year term. The board will have the power to select future members. It hears cases in panels of five members chosen at random.
Trump and conservatives were unhappy with the makeup of the board, which they saw as too liberal, according to The New Yorker. The former president even called CEO Mark Zuckerberg to express this sentiment, but Facebook didn't change the board members.
If you're skeptical, we hear you. Facebook doesn't have a great reputation for transparency.
That said, the charter establishing the board provides details of the efforts Facebook is taking to ensure the board's independence. For example, the board isn't a subsidiary of Facebook. It's a separate entity with its own headquarters and staff. It maintains its own website (in 18 languages, if you count US and UK English separately) and its own Twitter account.
Still, when it comes to money, the board is indirectly funded by Facebook through a trust. Facebook is funding the trust to the tune of $130 million, which it estimates will cover years of expenses.
Facebook says it will abide by the board's decisions even in cases when it disagrees with a judgment. (The social network says the only exceptions would be decisions that would force it to violate the law, an unlikely occurrence given the legal background of many board members.)
The board will also try to hold Facebook accountable, publishing an annual report that will include a review of the actions Facebook has taken as a result of its decisions.
Read more: Here's how you can submit an appeal to Facebook's oversight board.
Sure. The board decided in May that Facebook was justified in suspending Trump out of concern the former president could incite violence after he whipped up supporters as Congress gathered to certify Joe Biden's election. The decision, however, wasn't a blanket endorsement of Facebook's action, with the board taking exception to the open-ended nature of the penalty. The board said Facebook should reconsider the length of time that Trump was barred, and complete its review within six months.
The former president was kicked off Facebook and its Instagram photo-sharing service in the wake of the Jan. 6 Capitol Hill riot. Other social networks, including Twitter, also took action against Trump, who used their services to fan doubt over the legitimacy of the 2020 presidential election.
The case highlighted the difficult balance that private social media companies need to strike when handling political speech by public figures. Zuckerberg made the decision to ban Trump, who was still in office at the time, saying the risks of allowing him to continue posting were "simply too great."