When former Facebook product manager Frances Haugen unmasked herself earlier this month as the person who leaked company research calling its motives into question, it was clear she had major trust issues with her former employer.
"The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook. And Facebook, over and over again, chose to optimize for its own interests, like making more money," she told 60 Minutes.
Haugen's lawyers filed complaints with the US Securities and Exchange Commission, alleging that the world's largest social network misled investors and the public about its role in fueling misinformation, hate speech and human trafficking, along with the harms it contributes to teens' mental health. To support the allegations, Haugen copied tens of thousands of pages of internal research before she left the social network in May.
That research became the foundation for a series of stories, known as The Facebook Files, that The Wall Street Journal began publishing in September and that highlighted how much the company already knows about its platform's harms. A consortium of 17 American news outlets, including The Associated Press, The Wall Street Journal and The New York Times, then gained access to these documents and published their own stories, which started to roll out Friday. CNET has requested access to these documents, collectively known as the Facebook Papers.
For years, advocacy groups and even Facebook's own employees have complained about the gaps in how the social network enforces its rules against hate speech, misinformation and other offensive content. Here are some key takeaways from stories published about the Facebook Papers:
Facebook fails to police content in developing countries
News outlets, including The Associated Press, Reuters and The Washington Post, note that Facebook has struggled to effectively police content that fueled hate speech and violence in developing countries including India.
Part of the issue is that Facebook hasn't hired enough content moderators who possess the proper language skills and cultural context.
"The painful reality is that we simply can't cover the entire world with the same level of support," Samidh Chakrabarti, who was Facebook's civic integrity lead, wrote in a 2019 internal post viewed by the Post.
In the Middle East, Facebook has been used to trade and sell maids. The social network has struggled to crack down on human trafficking and Apple even threatened to pull Facebook and Instagram from the app stores two years ago because of those concerns, The Associated Press reported.
Facebook said in a blog post that it's been investing more resources in countries including Myanmar and Ethiopia. The company says 40,000 people work on safety and security, including global teams that review content in more than 70 languages.
Employees felt like Facebook didn't do enough ahead of the US 2020 election
While content moderation is worse in developing countries, Facebook's employees also felt like the company didn't do enough to crack down on misinformation ahead of the 2020 US presidential election.
Supporters of Donald Trump, who lost the election to Joe Biden, posted false claims that the election had been stolen. Facebook then suspended Trump from its platform until at least 2023 because of concerns his comments could incite violence following the deadly riot at the US Capitol in January.
Facebook's own research, according to NBC News, showed the platform was recommending more extremist content -- including material about QAnon -- to users. The documents also showed that Facebook was "unprepared" when it came to curbing the Stop the Steal movement, CNN reported.
Facebook said in a blog post that "responsibility for the insurrection lies with those who broke the law during the attack and those who incited them."
Teens are migrating away from Facebook
The documents also provided some data that showed Facebook is failing to attract teens to its platform.
One researcher shared data earlier this year showing that the number of teenage users in the US had dropped 13% since 2019. Teenage users, a valuable market for Facebook, were also projected to decline 45% over the next two years, according to The Verge.
Bloomberg also noted that teens are using duplicate accounts and users across all age groups are creating fewer posts.
Lawmakers are currently looking at legislation to safeguard child safety after The Wall Street Journal reported that Facebook's internal research showed the app was "toxic" to teenage girls who struggle with body image issues and suicidal thoughts.