
Facebook looks to the world for help fixing its content mess

In a new report, the social network suggests people around the globe want its content oversight board to have more power than originally anticipated.

Queenie Wong

Facebook released a 44-page report summarizing the feedback it's received about the creation of a content oversight board.

Photo Illustration by Omar Marques/SOPA Images/LightRocket via Getty Images

Noah Feldman, a professor at Harvard Law School, spends a lot of time pondering the impact social media has on free speech. A constitutional law scholar, he often considers how power imbalances can stifle some voices.

Not long ago, Feldman wondered whether Facebook could take a page from the legal system. The world's biggest social network, he thought, could set up an independent board to oversee content decisions, a function that would mirror the role of the Supreme Court. Last year, he laid out his thoughts in a 1,200-word essay and emailed it to Facebook COO Sheryl Sandberg.

"When advocates pressure Facebook or Google or Twitter to ban certain speech, they have a good chance of getting through to the companies," Feldman wrote in a pitch that proved successful. "No one is pushing hard on the other side of the door."

On Thursday, Feldman's vision moved closer to reality when Facebook published a 44-page report summarizing what the company learned from speaking with roughly 900 people and reviewing more than 1,200 public comments about the idea. The report suggests that Facebook, which released a draft charter for the board in January, could give the body the power to influence its policies rather than just the ability to review the social network's toughest content decisions. That could ultimately affect both everyday and high-profile Facebook users who appeal to the social network when their posts get removed.

"A strong consensus emerged that the Board's decisions should influence Facebook's policy development," according to the report. "Without some policy influence, the Board would not be seen as valuable or legitimate."

The report comes as the social media giant faces criticism from all sides about what content it leaves up or pulls down. Conservative commentators and politicians have said the social network is biased against right-wing views, citing the bans of provocateurs Alex Jones and Milo Yiannopoulos. Progressive groups say it's become a swamp of racist, sexist and misleading speech.

Facebook knows that the new board won't solve all of the social network's woes. It isn't designed to review the social network's News Feed ranking or artificial intelligence, according to the report.

Mark Zuckerberg, Facebook's CEO and co-founder, said Thursday in a discussion with Feldman and Jenny Martinez, the dean of Stanford Law School, that he's wary about giving the board too much responsibility.

"This is such an ambitious and unusual project in general for a company to take on, that one of the things that I've tried to be careful about is making sure the scope is clear in the beginning, so that way it doesn't collapse under its own weight," Zuckerberg said.

Global input

To get feedback, Facebook held workshops in Singapore, Delhi, Nairobi, Berlin, New York and Mexico City, along with 22 roundtables. More than 650 people from nearly 90 countries participated, including academics, legal scholars, members of international think tanks, free speech advocates and journalists.

During the workshops, participants were shown examples of content the social network has weighed in on and asked how they would handle it as a board. In one case, Facebook grappled with whether to remove a live video filmed by a candidate for office who made disparaging remarks about someone's gender identity. Weighing newsworthiness against safety, Facebook decided to remove the video, according to the report.


Harvard Law School Professor Noah Feldman came up with the idea of a Facebook content oversight board.

Photo by Rick Friedman/Corbis via Getty Images

Many participants, though, questioned the point of forming a content oversight board if it didn't have the power to recommend changes to Facebook's policies.

"It was a reaffirmation of what people wanted to see out of an oversight board," said Zoe Darme, Facebook's manager of global affairs and governance. "They want something that actually has teeth."

An oversight board could bolster the legitimacy of Facebook's community standards, which have been criticized for being difficult to decipher. Facebook has rules that bar its users from posting hate speech, child nudity and other offensive content. If the board is given power to mold those standards, which Facebook suggests in the report, it might help the social network develop a more consistent interpretation of its rules, which can seem arbitrary and have sometimes sparked public outrage.

In one of its most controversial decisions, Facebook in 2016 pulled down an iconic Vietnam War photo of a girl fleeing a napalm attack. The company defended the removal, saying the image violated its rules on child nudity, only to reverse its decision in the face of widespread criticism because of the image's historical importance.

Some of Facebook's content moderation decisions have high stakes. UN investigators have found that Facebook played a role in spreading hate speech that fueled ethnic cleansing in Myanmar. More recently, Facebook faced criticism after it decided to not remove a doctored video of House Speaker Nancy Pelosi that made her seem drunk.

From deciding which cases should be heard to determining the board's makeup and whether membership should be a full-time job, Facebook still has a lot of details to sort out. Other social media companies, including Twitter and Google-owned YouTube, haven't proposed setting up a board for content moderation.

Huge company, huge task

Creating a board to handle content appeals is extremely tough, experts say, because Facebook is a global company subject to different laws in different countries, and people will likely have different views on which cases the board should prioritize. From January to March, Facebook received nearly 25 million appeals over content it pulled down for violating its rules, according to a report released in May. The draft charter suggests a board of 40 members who serve part time for three-year terms. It would hear cases referred by Facebook or by users who appeal when their posts are removed.

As Facebook tries to create an independent board, some critics, such as conservative UK lawmaker Damian Collins, have accused the social network of trying to "pass on the responsibility." That may be difficult for the social network to shake.

"I think everybody agrees there should be greater outside oversight of Facebook. But how you accomplish that is really a tricky question," said Nate Persily, a professor at Stanford law school who taught a course in which students tried to craft a content oversight board and presented their findings to Facebook.


Jillian York, the Electronic Frontier Foundation's director for international freedom of expression, said Facebook could have a hard time filling the board, particularly if membership is a full-time job, because it would require people to leave their current jobs or relocate from another country for a short-term position. York attended one of Facebook's workshops in Berlin this month.

"I have a strong understanding of the challenges that this oversight board will face coming out of this meeting," she said. "I'm not sure that Facebook is ready for all of those challenges, however."

Color of Change Campaign Director Evan Feeney, who went to one of Facebook's workshops in New York, said he walked out with more questions than answers, including about the board's role.

"If it's just individual cases, that's not a high-value impact," he said. "I think the board has to lean more towards reviewing content where maybe the policy isn't clear."

Facebook said the final charter for the oversight board will be released in August. Selection of board members is underway and will continue into later this year.

Proving independence

One of Facebook's biggest challenges as it moves forward with creating the board is convincing the public that the body is independent of the social network. Workshop participants suggested that Facebook appoint a selection committee to choose board members and set up an independent trust to fund them.

Because only a limited number of people will serve on the board, there could also be concerns about whether a board member's political views are influencing decisions, experts say.

"If people aren't buying that the board is a legitimate body to be articulating a view on something then it will not be very useful to Facebook or its users," said Emma Llansó, the director of the Center for Democracy & Technology's free expression project. 

Ultimately, the board might have to just show through its actions that it isn't doing Facebook's bidding.

"You will convince the world that this board is independent, when it makes decisions that don't exactly match what Facebook would like them to be," Feldman said. 

Originally published June 27 at 6 a.m. PT
Update 6:21 a.m. PT: Adds statements from the report.
Update 9:14 a.m. PT: Adds quote from Mark Zuckerberg and a content moderation example from the report.
