Facebook Papers produce flurry of reports on social network's internal chaos
Several major news outlets publish reports focused on the social network's struggle to contain dangerous content.
Edward Moyer, Senior Editor
The Washington Post on Friday reported on concern among Facebook employees about the role the site played in the spread of misinformation that helped fuel the deadly Jan. 6 storming of the US Capitol. On Saturday, both The New York Times and The Wall Street Journal published stories about misinformation and hate speech on Facebook services in India, the company's largest market.
The Post's report followed Friday stories by Bloomberg and NBC News that also focused on the spread of misinformation on Facebook in the US, which themselves came on top of similar stories published that day by the Journal and the Times.
In a broad sense, the issues have to do with whether Facebook can be relied on to responsibly balance business motives with social concerns and do away with the flood of dangerous content that has spread on its various social-networking platforms. The company's algorithms drive user engagement, but they can also create problems when it comes to misinformation, hate speech and the like. Matters are complicated by the need to respect free speech while cracking down on problematic posts.
Facebook, for its part, has said that internal documents are being misrepresented and that a "false picture" is being painted of the social-networking giant. "I'm sure many of you have found the recent coverage hard to read because it just doesn't reflect the company we know," CEO Mark Zuckerberg wrote in an email to employees earlier this month. "We care deeply about issues like safety, well-being and mental health."
Facebook didn't immediately respond to a request for comment on the new batch of reports based on documents leaked by whistleblower Frances Haugen. In a Friday blog post, the head of Facebook's integrity efforts defended the company's actions to protect the 2020 US presidential election and outlined the steps taken by the social network.
In its story about the social network and India, the Times reports that in February 2019, a Facebook researcher opened a new user account in Kerala, India, to get an idea of what site users there would see. The researcher followed the recommendations generated by the social network's algorithms to watch videos, check out new pages and join groups on Facebook. "The test user's News Feed has become a near constant barrage of polarizing nationalist content, misinformation, violence and gore," an internal Facebook report said later that month, according to the Times.
That echoes the findings of a similar 2019 project conducted by a Facebook researcher in the US, who set up a test account for "Carol Smith," a fictitious "conservative mom" in North Carolina. In two days, NBC News reported, the social network was recommending that she join groups dedicated to the bogus QAnon conspiracy theory. According to NBC, the experiment was outlined in an internal Facebook report called "Carol's Journey to QAnon," a document also referenced by the Times, the Journal and the Post.
"The body of research consistently found Facebook pushed some users into 'rabbit holes,' increasingly narrow echo chambers where violent conspiracy theories thrived," the NBC News report reads. "People radicalized through these rabbit holes make up a small slice of total users, but at Facebook's scale, that can mean millions of individuals."
In regard to the Times' report about India, a Facebook spokesman told the news outlet that the social network had put significant resources into technology designed to root out hate speech in various languages, including Hindi and Bengali, and that this year, Facebook had halved the amount of hate speech that users see worldwide.
As for the "Carol's Journey to QAnon" report, a Facebook spokesperson told NBC News that the document points to the company's efforts to solve problems around dangerous content. "While this was a study of one hypothetical user, it is a perfect example of research the company does to improve our systems and helped inform our decision to remove QAnon from the platform," the spokesperson told the news outlet.