
Facebook whistleblower reveals herself, says company prioritizes its own interests

A former product manager at the social network says she was disappointed that safety systems put in place for the 2020 election were temporary.

Steven Musil Night Editor / News

Frances Haugen, the whistleblower who leaked internal Facebook documents to The Wall Street Journal. (Image: 60 Minutes)

The person behind a leak of internal Facebook research that served as the basis for a series of news reports about the harm the social network's platforms cause revealed herself publicly on 60 Minutes on Sunday. She is Frances Haugen, a former algorithmic product manager at Facebook.

Haugen, who worked at Facebook for about two years, told 60 Minutes she leaked the documents to The Wall Street Journal after seeing a conflict of interest at Facebook between what's good for the company and what's good for the public.

"Facebook, over and over again, chose to optimize for its own interests, like making more money," she told 60 Minutes' Scott Pelley in an interview.

"I knew what my future looked like if I continued to stay inside of Facebook, which is person after person after person has tackled this inside of Facebook and ground themselves to the ground," the 37-year-old data scientist said.

The Wall Street Journal's series of stories on the documents found, among other things, that the company ignored research about how Instagram can harm teen girls and that it made an algorithm change intended to improve interaction on the platform that instead made users "angrier."

Haugen explained how the algorithm has "thousands of options" for what it could show you in your feed based on what you've engaged with in the past.

"One of the consequences of how Facebook is picking out that content today is it is -- optimizing for content that gets engagement, or reaction," she said. "But its own research is showing that content that is hateful, that is divisive, that is polarizing, it's easier to inspire people to anger than it is to other emotions."

During last year's elections, Haugen said she was assigned to Facebook's Civic Integrity project, which worked to identify and reduce risks to elections, including misinformation. She said the company knew the dangers associated with the 2020 election but that its response was temporary. She said employees were told the unit was being dissolved because the election had ended without riots.

"Fast forward a couple months, we got the insurrection," she said. "And when they got rid of Civic Integrity, it was the moment where I was like, 'I don't trust that they're willing to actually invest what needs to be invested to keep Facebook from being dangerous.'

"And as soon as the election was over, they turned them back off or they changed the settings back to what they were before, to prioritize growth over safety," she said. "And that really feels like a betrayal of democracy to me."

Facebook didn't immediately respond to a request for comment on Haugen's appearance on 60 Minutes. However, The New York Times reported over the weekend that Nick Clegg, Facebook's head of policy and global affairs, sent a 1,500-word memo to employees ahead of the news magazine's segment.

"Social media has had a big impact on society in recent years, and Facebook is often a place where much of this debate plays out," he wrote, according to The Times. "But what evidence there is simply does not support the idea that Facebook, or social media more generally, is the primary cause of polarization."


Haugen's appearance on 60 Minutes came just days after a Senate subcommittee held a hearing about Facebook's and Instagram's harmful effects on the mental health of young people, including teenagers. US lawmakers are seeking more answers from the social media giant after The Wall Street Journal published a series of stories about the company's knowledge of the platform's problems even as it downplayed them publicly. One in three teen girls reported that Instagram made their body image issues worse, according to a 2019 presentation cited by the Journal.

During the hearing, Facebook's global head of safety, Antigone Davis, pushed back on the news outlet's characterization of its internal research. "I want to be clear that this research is not a bombshell," Davis said. "It's not causal research." 

Instagram, which is owned by Facebook, is pausing development of a kids' version of the app. The social network also released some of its internal research and said it's looking at ways to release more data.

Davis' remarks didn't appear to appease lawmakers, who are planning to hold more hearings on the issue. Haugen is scheduled to testify before the Senate subcommittee on consumer protection on Tuesday. During the 60 Minutes interview, she suggested the federal government should impose regulations.

"Facebook has demonstrated they cannot act independently," she said. "Facebook, over and over again, has shown it chooses profit over safety. It is subsidizing, it is paying for its profits with our safety."

CNET's Queenie Wong and Andrew Morse contributed to this report.