The critical spotlight on Facebook intensified over the weekend and into Monday, as news organizations published a wave of reports based on the cache of internal company documents leaked by former Facebook employee Frances Haugen.
The Washington Post on Friday reported on concern among Facebook employees about the role the site played in helping fuel the deadly Jan. 6 storming of the US Capitol. On Saturday, both The New York Times and The Wall Street Journal published stories about misinformation on Facebook services in India, the company's largest market.
The Post's report followed Friday stories by Bloomberg and NBC News that also focused on the spread of misinformation on Facebook in the US, which came on top of similar reports that day in the Journal and the Times.
Monday brought another wave of stories from a wider range of outlets, including The Associated Press, The Atlantic, CNBC, CNN, Politico, The Verge and Wired. Also on Monday, Haugen addressed the British Parliament. Separately, a UK charity on Monday cited police records of child abuse incidents dating back to 2017 and implored the company to disclose its internal research on the issue.
In a broad sense, the issues have to do with whether Facebook can be relied on to responsibly balance business motives with social concerns and stem the flood of dangerous content that has spread on its various social-networking platforms. The company's algorithms drive user engagement, but they can also create problems when it comes to misinformation, hate speech and the like. Matters are complicated by the need to respect free speech while cracking down on problematic posts.
Critics say Facebook has already dropped the ball too many times when it comes to policing its platforms and that the company puts profits ahead of people. Haugen told a US Senate subcommittee on Oct. 5 that Facebook's products "harm children, stoke division and weaken our democracy."
Facebook, on the other hand, has said that internal documents are being misrepresented and that a false picture is being painted of the social-networking giant. "I'm sure many of you have found the recent coverage hard to read because it just doesn't reflect the company we know," CEO Mark Zuckerberg wrote in an email to employees earlier this month. "We care deeply about issues like safety, well-being and mental health."
The flurry of new reports based on documents leaked by Haugen follows an earlier investigation in the Journal that relied on that same cache of information. The new stories also come as lawmakers in the US and elsewhere wrestle with whether to regulate Facebook and other social media companies, and if so, how.
Facebook didn't immediately respond to a request for comment on the new batch of reports based on documents leaked by Haugen. In a Friday blog post, the head of Facebook's integrity efforts defended the company's actions to protect the 2020 US presidential elections and outlined the steps taken by the social network.
In its story about the social network and India, the Times reports that in February 2019, a Facebook researcher opened a new user account in Kerala, India, to get an idea of what site users there would see. The researcher followed the recommendations generated by the social network's algorithms to watch videos, check out new pages and join groups on Facebook. "The test user's News Feed has become a near constant barrage of polarizing nationalist content, misinformation, violence and gore," an internal Facebook report said later that month, according to the Times.
That echoes the findings of a similar 2019 project conducted by a Facebook researcher in the US, who set up a test account for "Carol Smith," a fictitious "conservative mom" in North Carolina. In two days, NBC News reported, the social network was recommending that she join groups dedicated to the QAnon conspiracy theory. According to NBC, the experiment was outlined in an internal Facebook report called "Carol's Journey to QAnon," a document also referenced by the Times, the Journal and the Post.
"The body of research consistently found Facebook pushed some users into 'rabbit holes,' increasingly narrow echo chambers where violent conspiracy theories thrived," the NBC News report reads. "People radicalized through these rabbit holes make up a small slice of total users, but at Facebook's scale, that can mean millions of individuals."
In regard to the Times' report about India, a Facebook spokesman told the news outlet that the social network had put significant resources into technology designed to root out hate speech in various languages, including Hindi and Bengali, and that this year, Facebook had halved the amount of hate speech that users see worldwide.
As for the "Carol's Journey to QAnon" report, a Facebook spokesperson told NBC News that the document points to the company's efforts to solve problems around dangerous content. "While this was a study of one hypothetical user, it is a perfect example of research the company does to improve our systems and helped inform our decision to remove QAnon from the platform," the spokesperson told the news outlet.