Creating an independent board to review Facebook's decisions sounds like a sensible way to handle the social network's toughest calls on what content stays or goes. But the company is finding out that putting such a panel together will be a daunting challenge.
Facebook detailed some of the issues in a 44-page report Thursday, summing up feedback gathered around the world. The company said it spoke with roughly 900 people and reviewed more than 1,200 public comments about the proposed 40-person panel, which has been dubbed the Facebook "Supreme Court."
The report, called "Global Feedback & Input on the Facebook Oversight Board for Content Decisions," was accompanied by a video chat among CEO Mark Zuckerberg; Jennifer Martinez, dean of Stanford Law School; and Noah Feldman, a Harvard Law School professor who pitched the idea to Facebook last year.
Here are four takeaways from their discussion.
1. The board's role could become bigger in the future
Facebook doesn't just make decisions about what content to leave up or pull down. The social network also uses a range of signals, such as which posts you comment on or "like," to decide what to display higher in your News Feed.
The board could eventually have the power to influence Facebook's policies and how content is "treated" in the future, Zuckerberg said.
"There's a lot that this board could eventually do," Zuckerberg told his guests. "The goal is going to be to start narrowly and then eventually over time expand its scope and hopefully include more folks in the industry as well."
2. How quickly the board moves will be a big challenge
Facebook has faced criticism for not pulling down hate speech, bullying or misinformation quickly enough. Even Zuckerberg has acknowledged that the company should've acted more swiftly to prevent a doctored video of House Speaker Nancy Pelosi from spreading.
Moving quickly will be a "make or break" issue for the board's credibility, Martinez told Zuckerberg.
Facebook needs a way to refer cases to the board before the content goes viral, Zuckerberg said.
"It's not that we're ever going to be out of the business of having to make these decisions ourselves internally," he said.
3. International courts could provide a model for Facebook's content moderation board
Facebook crosses international borders, so the board will have to strike a balance between protecting the principles of free speech and the local laws that govern what is and isn't acceptable. That'll be tricky because some countries, notably the US, have an almost anything-goes approach, while others, such as Germany and France, curtail some forms of expression, such as hate speech.
Martinez, a scholar in human rights law who worked on the UN tribunal for the former Yugoslavia, says international courts could provide an example for the board to study.
She said international courts try to set a floor in terms of what all members are expected to protect. She cited the European Court of Human Rights, which hears cases in which a country is alleged to have breached civil or political rights. Alongside that floor, a doctrine called the "margin of appreciation" lets the court uphold fundamental principles while giving member states leeway to accommodate differences in local laws, cultures and needs, she said.
A similar setup for Facebook's content moderation board might help it weigh free speech principles and local legal and cultural issues, though tricky questions would still remain for countries that lean toward suppressing expression.
4. The board will have to prove through its actions that it's legitimate
Facebook needs to show the public that the board isn't just a fall guy for the social network when it makes a decision that sparks public backlash. Ultimately, that means the board will have to overturn decisions that Facebook previously made.
"Legitimacy ultimately ... will be real when people see decisions that are different from what Facebook would otherwise have decided to do," Feldman said.
Facebook also has to decide how it will select board members, and some commenters have suggested that it create a selection committee.
That still might not be enough to convince the public the board is independent, Feldman said.
"Maybe there's a hybrid solution, you know, where we can choose some of the people and then those people could participate alongside Facebook and external input on choosing the next set of people," Feldman said.
CNET's Andrew Morse contributed to this report.