Facebook's content oversight board includes former prime minister, Nobel winner

The social network named the first 20 members of a board that will review its content moderation decisions.

Queenie Wong Former Senior Writer

Facebook's new content oversight board has its first 20 members.

Angela Lang/CNET

Facebook unveiled on Wednesday the first 20 members of a new oversight board that'll review some of the social network's toughest content moderation decisions, an experiment that could change what stays up or gets removed from the site.

The board includes a mix of professors, lawyers, former judges, journalists and digital rights advocates, as well as a former prime minister of Denmark and a Nobel Peace Prize winner. The members have lived in more than 27 countries and speak at least 29 languages. About a quarter of the board members are from the US or Canada. All the members, Facebook said, are committed to freedom of expression.

"The announcement of the first group of oversight board members marks the beginning of a fundamental change in the way some of the most difficult and significant decisions about content on Facebook will be made," said Brent Harris, who oversees governance and global affairs at the social network.

Facebook has faced scrutiny from lawmakers, journalists and advocacy groups for decisions about which content it leaves up or removes. The new board could give the social network a way to fend off this criticism, as well as feedback for improving its community standards. With 2.6 billion monthly active users globally, Facebook said, the board will prioritize cases that affect a large number of people, fuel public debate or threaten someone's safety or equality. While the board is expected to hear only dozens of cases in its first year, it'll be able to make policy recommendations to Facebook, which could affect all users.

"We are not the internet police," said Michael McConnell, a former US federal circuit judge and constitutional law professor at Stanford Law School, who's a co-chair of the oversight board. "Don't think of us as sort of a fast action group that's going to swoop in and deal with rapidly moving problems."

The board's job, he said, is to consider appeals and take a second look at Facebook's content decisions to advance "fairness and neutrality in decision making." 

Some of Facebook's most controversial content moderation decisions have involved political speech. The social network has repeatedly denied allegations that it censors conservative speech. Conservative commentators and politicians have accused the social network of being biased against right-wing views, citing the bans of provocateurs Alex Jones and Milo Yiannopoulos. 

Adding to the confusion, the company has also reversed content decisions in the past following public backlash. In 2016, Facebook pulled down an iconic Vietnam War photo of a girl fleeing a napalm attack. The company initially defended the removal, saying the Pulitzer Prize-winning photo violated its rules on child nudity, but it later reversed course, citing the image's historical importance.

Helle Thorning-Schmidt, who served as Denmark's prime minister from 2011 to 2015, said social media can help us stay connected but that it also has its downsides. Thorning-Schmidt is a co-chair of Facebook's new oversight board.

"Social media can spread speech that is hateful, deceitful and harmful," she said. "And until now, some of the most difficult decisions around content have been made by Facebook, and you could say ultimately by Mark Zuckerberg."

Zuckerberg, Facebook's CEO and co-founder, first announced plans to create an oversight board in 2018. The idea was pitched to Facebook by Noah Feldman, a professor at Harvard Law School who specializes in constitutional studies. Advocates who pressure social media companies to ban certain speech, he told Facebook, have a good chance of getting this content removed but "no one is pushing hard on the other side of the door."

One of Facebook's biggest challenges will be convincing the public that the board is truly independent from the social network. Facebook said it'll follow the board's decisions even if it disagrees with the result, unless doing so would violate the law. The board will publish an annual report that'll include what Facebook has done as a result of its decisions.

"Of course, it'll be very embarrassing for Facebook if they don't live up to their end of this bargain," Thorning-Schmidt said.

Other Facebook oversight board members include Tawakkol Karman, a journalist, civil rights activist and the first Arab woman to win a Nobel Peace Prize; Alan Rusbridger, the former editor-in-chief of The Guardian; Emi Palmor, former director general of the Israeli Ministry of Justice; and Katherine Chen, communications scholar at the National Chengchi University in Taiwan.

Columbia Law Professor Jamal Greene and Catalina Botero-Marino, a Colombian attorney who served as Special Rapporteur for Freedom of Expression for the Inter-American Commission on Human Rights, are also co-chairs of Facebook's oversight board.

The board could grow to 40 members. Members will serve part time for three-year terms and be paid through a multimillion-dollar trust, and the board will have the power to select future members. It'll hear cases in panels of five people chosen at random. The board will consider cases put forth by Facebook, but users will also be able to appeal to the board if they meet certain requirements.