The social networks remove millions of posts, photos and videos every quarter for violating their rules against nudity, hate speech and other types of offensive content. If you're affected, you can ask Facebook and Instagram to review the decision, but that doesn't guarantee a reversal.
Now you have another option. Starting Thursday, you can ask Facebook's new oversight board to take another look at your case. If the board rules in your favor, the content will be restored. In the coming months, users will also be able to appeal to the board about content they think should have been removed. The board is made up of 20 experts and civic leaders, including the former prime minister of Denmark, a Nobel Peace Prize laureate, law professors and journalists.
"The oversight board wasn't created to be a quick fix or an all-encompassing solution, but to offer a critical independent check on Facebook's approach to moderating some of the most significant content issues," Helle Thorning-Schmidt, the former prime minister of Denmark, who co-chairs the oversight board, said in a press conference Thursday.
Here's how it works: After a final removal decision is made, you could receive a message in your support inbox within the Facebook or Instagram app that includes an oversight board reference ID. If you receive this ID, your post is eligible for review by the board. You'll have 15 days to submit an appeal.
To do so, you'll need to visit the oversight board's website and click "Start Submission" at the bottom of the Appeals Process section. You'll be asked to log in to your Facebook or Instagram account, depending on where you posted the content. After entering your reference ID and consenting to how your information could be used, you'll answer questions about why you posted, why you're appealing and why you think the decision was wrong. After submitting your case to the board, you can track updates. If your case is chosen for review, the oversight board will issue a public explanation of its ruling. (Users can ask the board not to share personally identifiable information about them.)
Only some Facebook and Instagram content will be eligible for review by the oversight board. For example, the board won't be reviewing ads or direct messages. The board also won't be taking another look at child exploitation images, because reinstating the photos could be illegal. The changes are being rolled out over time, because Facebook wants to ensure the products for the board and users are stable. That means you may have to wait before this option's available to you.
Facebook's sites have billions of monthly active users globally and a staggering volume of content, so you'll probably have a tough time getting the board to review your case. The board can also consider cases referred by Facebook and will have to weigh whether they're significant, globally relevant and likely to affect the social network's future policy. The board is prioritizing cases it thinks could affect a large number of users, are important to public discourse or raise key questions about Facebook's policies.
"I don't think there's any question that -- even though [the board] will take only a handful of cases in the beginning -- those cases have the power and potential to be spectacularly influential in the world," Brent Harris, who oversees governance at Facebook, said in an interview.
To help the board review submitted cases, Facebook built a tool that lets the board track what's submitted and sort the cases based on topics. The board will see how many times a piece of content has been reported, as well as other information about the case.
It could also take some time before the board decides on a case. It has up to 90 days to uphold or overturn a removal decision. Facebook can submit a case to the board for expedited review, which could take up to 30 days.
Thorning-Schmidt said during the call that Facebook has been criticized for "moving fast and breaking things" but that the board wants to be the opposite of that and look at long-term issues.
Facebook's decisions to leave up or pull down content have sparked more scrutiny ahead of the US presidential election. Civil rights activists and lawmakers have criticized the company for not doing enough to remove hate speech. At the same time, Republicans say Facebook is suppressing their content to sway the outcome of the election -- allegations the company repeatedly denies.
Facebook grappled with more political pressure after it limited the reach of a New York Post article about Democratic presidential nominee Joe Biden's son, as it was being fact-checked. That reignited concerns about anti-conservative bias, and Senate Republicans called on Facebook CEO Mark Zuckerberg, as well as Twitter CEO Jack Dorsey, to testify about the issue.
Harris said the company could ask the board to issue a policy advisory opinion related to how the social network handled the New York Post article. Facebook is just starting the process of referring cases to the board and didn't have any more details about what cases the company is thinking about submitting for review.
"The scope for this board is broad and has been built to really address a wide array of difficult content decisions," Harris said.