
Facebook steps up efforts to moderate content and combat fake news

Facebook wants to create an independent board to review content that gets pulled down.

Queenie Wong

Facebook, the world's largest social network, faces a long list of challenges, ranging from election meddling and fake news to allegations of data misuse and censorship.

Now, the tech giant is trying to show both lawmakers and its nearly 2.3 billion users that it's serious about tackling these problems.

On Monday, Facebook revealed more details about an independent board it's forming that would be able to reverse the company's own decisions about what content it leaves up or pulls down. The company is soliciting feedback from think tanks, researchers and other groups on how the board could be structured. Facebook is specifically seeking ideas on how the board could work, what types of cases it might hear and how it could handle cultural differences.

The board might include 40 members, each serving a three-year term, though the company could adjust both the board's size and the length of the terms.

Facebook also outlined the ways it's doing more to combat misinformation and election meddling ahead of the European Parliament election in May. The efforts include the rollout of a tool that tracks political ads globally and the establishment of "regional operations centers" in Facebook's Dublin and Singapore offices to thwart fake news, hate speech and voter suppression before elections. 

The moves come as Facebook attempts to rebuild user trust after a series of scandals, including the revelation that Cambridge Analytica, a UK political consultancy, harvested data on as many as 87 million users without their permission. Facebook executives, including CEO Mark Zuckerberg and COO Sheryl Sandberg, have been hauled in front of Congress to testify on privacy issues. Facing calls for more regulation, the company is also under pressure to show that its efforts to combat fake news and election meddling are actually working. 

"On elections, I'm in no doubt that we have a lot of work to do to demonstrate that Facebook tools can provide a positive contribution to the quality of our democracy," Nick Clegg, Facebook's new head of global affairs and communications, said in his first public speech at an event in Brussels. "But much of the skepticism that Facebook faces as a company and as an industry is about something more fundamental: the role of personal data in the internet economy."

Clegg is a former deputy prime minister of the UK.

Facebook also said it's releasing a political ads database in Europe, India, Ukraine and Israel ahead of elections in those countries before expanding the tool worldwide by the end of June. Last year, Facebook started requiring US advertisers who posted ads about politics or "issues of national importance" to verify their identity and location. Facebook is expanding this rule to other countries. The ads must include a "paid for by" disclaimer, and they're stored in a public database for up to seven years.

The launch of the political ads database in the US wasn't without glitches. Some users raised concerns that their ads were being mischaracterized as political, while news outlets such as Vice and Business Insider found loopholes in the system that allowed them to get approval for fake ads listed as paid for by the Islamic State and all 100 US senators.

Meanwhile, Facebook has come under fire in the past for its decisions about what content to pull down or keep up. The company removed, then restored, an iconic Vietnam War photo of a naked 9-year-old girl fleeing a napalm attack, and it has denied allegations that it suppresses conservative speech.

"I think it's really important that Facebook does not becoming an arbiter of what is politically accepted speech beyond the obvious cases of hate speech and illegal speech," Clegg said on Monday.

The company is turning to outside experts in privacy, journalism, civil rights and other topics to weigh in on these contentious decisions as part of a new board tasked with evaluating content. The board will review content Facebook leaves up or pulls down and be empowered to reverse the company's decisions. 

Facebook still faces questions about how an oversight board for content decisions will work and what cases it will hear. 

Among the challenges are how the board will ensure cultural sensitivity as it makes decisions affecting Facebook's 2.3 billion members around the world, and how board members can be chosen fairly and transparently.

Facebook users who disagree with a content decision, or the company itself, could refer a case to the new board.

The company is hosting workshops in Singapore, Delhi, Nairobi, Berlin, New York, Mexico City and other cities over the next six months to help answer these questions.

"As we build out the board," Clegg wrote in a blog post. "we want to make sure it is able to render independent judgment, is transparent and respects privacy."
