
Facebook resisted changes meant to dial back viral content, report says

Leaked company documents are providing more insight into how Facebook moderates political content.

Queenie Wong
Facebook is under more scrutiny because of leaked internal research and documents. (Angela Lang/CNET)

Ahead of the 2020 US presidential election, Facebook executives reportedly resisted efforts to dial back features that help amplify false and inflammatory content because they feared doing so could harm the platform's usage and growth.

The Wall Street Journal, citing leaked internal documents, said Facebook employees suggested changes that could slow the spread of viral content, such as killing the reshare button or stopping the promotion of reshared content unless it came from a user's friend. Kang-Xing Jin, who heads Facebook's health initiatives, was a proponent of these kinds of changes, according to the report. But executives such as John Hegeman, Facebook's head of ads, raised concerns about stifling viral content.

"If we remove a small percentage of reshares from people's inventory, they decide to come back to Facebook less," Hegeman wrote in internal communications cited by the Journal.

The communications are the latest in a series of leaked internal documents that the Journal says show Facebook has put its profits over the safety of its users. Frances Haugen, who used to work as a Facebook product manager, publicly identified herself as the whistleblower who gathered the leaked documents used by the Journal. The findings from these internal documents have revived scrutiny by US and UK lawmakers. Haugen, who has already appeared before Congress, is scheduled to testify before the UK Parliament on Monday.

Facebook has repeatedly said its internal research and correspondence are being mischaracterized. "Provocative content has always spread easily among people. It's an issue that cuts across technology, media, politics and all aspects of society, and when it harms people, we strive to take steps to address it on our platform through our products and policies," a Facebook spokesman said in a statement.

The moderation of political content, though, has been a hot-button issue for the company as it tries to balance safety with concerns about hindering free speech. Conservatives have also accused Facebook of intentionally censoring their content, allegations the company denies. 

Facebook's approach to moderating content from groups that it considers dangerous has been described as a game of whack-a-mole by the Journal.

The New York Times, also citing internal documents, reported Friday that Facebook failed to address misinformation and inflammatory content before and after the 2020 US presidential election even though employees had raised red flags about the issue. 

Supporters of Donald Trump, who lost the election to Joe Biden, were posting false claims that the election had been stolen. Facebook has suspended Trump from its platform until at least 2023 because of concerns his comments could incite violence following the deadly US Capitol riot in January.

One Facebook data scientist found that 10 percent of all US views of political content were of posts that alleged the vote was fraudulent, according to the Times. Facebook employees also felt the company could've done more to crack down on misinformation and conspiracy theories. 

A Facebook spokesperson said the company spent more than two years preparing for the 2020 election and that more than 40,000 people now work on safety and security. The company adjusted some of its measures before, during and after the election in response to information from law enforcement. "It is wrong to claim that these steps were the reason for January 6th -- the measures we did need remained in place well into February, and some like not recommending new, civic, or political groups remain in place to this day," Facebook said.

In a blog post Friday evening, the head of Facebook's integrity efforts defended the company's actions to protect the elections and outlined the steps taken by the social network. Some of the actions, including limiting the distribution of live video that was predicted to relate to the election, were referred to internally as "break the glass" measures, VP of integrity Guy Rosen said in the post.

The Times story is part of a series expected from an international group of news organizations that also received documents from Haugen, according to The Information. More stories are expected next week, when Facebook reports earnings and holds its Connect conference on augmented and virtual reality.