Speaker 1: Good afternoon, Chairman Blumenthal, Ranking Member Blackburn, and members of the subcommittee. Thank you for the opportunity to appear before you. My name is Frances Haugen. I used to work at Facebook. I joined Facebook because I think Facebook has the potential to bring out the best in us. But I am here today because I believe Facebook's products harm children, stoke division, and weaken our democracy. The [00:00:30] company's leadership knows how to make Facebook and Instagram safer, but won't make the necessary changes because they have put their astronomical profits before people. Congressional action is needed. They won't solve this crisis without your help. Yesterday, we saw Facebook get taken off the internet. I don't know why it went down, but I know that for more than five hours, Facebook wasn't used to deepen divides, destabilize democracies, and make young girls [00:01:00] and women feel bad about their bodies. It also means that millions of small businesses weren't able to reach potential customers, and countless photos of new babies weren't joyously celebrated by family and friends around the world. I believe in the potential of Facebook. We can have social media we enjoy, that connects us, without tearing our democracy apart, putting our children in danger, and sowing ethnic violence around the world. We [00:01:30] can do better.
Speaker 1: I have worked as a product manager at large tech companies since 2006, including Google, Pinterest, Yelp, and Facebook. My job has largely focused on algorithmic products, like Google+ Search, and recommendation systems, like the one that powers the Facebook News Feed. Having worked on four different types of social networks, I understand how complex and nuanced these problems are. However, the choices being made inside of [00:02:00] Facebook are disastrous for our children, for our public safety, for our privacy, and for our democracy. And that is why we must demand Facebook make changes. During my time at Facebook, first working as the lead product manager for civic misinformation and later on counter-espionage, I saw Facebook repeatedly encounter conflicts between its own profits and our safety. Facebook consistently resolved these conflicts in favor of its own profits. [00:02:30] The result has been more division, more harm, more lies, more threats, and more combat.
Speaker 1: In some cases, this dangerous online talk has led to actual violence that harms and even kills people. This is not simply a matter of certain social media users being angry or unstable, or about one side being radicalized against the other. It is about Facebook choosing to grow at all costs, becoming an almost trillion-dollar company by buying [00:03:00] its profits with our safety. During my time at Facebook, I came to realize a devastating truth: almost no one outside of Facebook knows what happens inside of Facebook. The company intentionally hides vital information from the public, from the U.S. government, and from governments around the world. The documents I have provided to Congress prove that Facebook has repeatedly misled the public about what its own research reveals about the safety of children, the efficacy of its artificial [00:03:30] intelligence systems, and its role in spreading divisive and extreme messages.
Speaker 1: I came forward because I believe that every human being deserves the dignity of the truth. The severity of this crisis demands that we break out of our previous regulatory frames. Facebook wants to trick you into thinking that privacy protections or changes to Section 230 alone will be sufficient. While important, these will not get to the core of the issue, which is that no one truly understands [00:04:00] the destructive choices made by Facebook except Facebook. We can afford nothing less than full transparency. As long as Facebook is operating in the shadows, hiding its research from public scrutiny, it is unaccountable. Until the incentives change, Facebook will not change. Left alone, Facebook will continue to make choices that go against the common good, our common good. When we realized big tobacco [00:04:30] was hiding the harms it caused, the government took action. When we figured out cars were safer with seat belts, the government took action.
Speaker 1: And when our government learned that opioids were taking lives, the government took action. I implore you to do the same here today. Facebook shapes our perception of the world by choosing the information we see. Even those who don't use Facebook are impacted by the majority who do. A company with such frightening influence over so many people, over [00:05:00] their deepest thoughts, feelings, and behavior, needs real oversight. But Facebook's closed design means it has no real oversight. Only Facebook knows how it personalizes your feed for you. At other large tech companies, like Google, any independent researcher can download from the internet the company's search results and write papers about what they find. And they do. But Facebook hides behind walls that keep researchers and regulators [00:05:30] from understanding the true dynamics of their system. Facebook will tell you privacy means they can't give you data. This is not true.
Speaker 1: When tobacco companies claimed that filtered cigarettes were safer for consumers, scientists could independently invalidate these marketing messages and confirm that, in fact, they posed a greater threat to human health. The public cannot do the same with Facebook. We are given no other option than to take their marketing messages [00:06:00] on blind faith. Not only does the company hide most of its own data, my disclosures have proved that when Facebook is directly asked questions as important as "How do you impact the health and safety of our children?" they choose to mislead and misdirect. Facebook has not earned our blind faith. This inability to see into Facebook's actual systems [00:06:30] and confirm that they work as communicated is like the Department of Transportation regulating cars by only watching them drive down the highway. Today, no regulator has a menu of solutions for how to fix Facebook, because Facebook didn't want them to know enough about what's causing the problems. Otherwise, there wouldn't have been a need for a whistleblower. How is the public supposed to assess if Facebook is resolving conflicts of interest in a way that is aligned with the public good [00:07:00] if the public has no visibility into how Facebook operates? This must change.
Speaker 1: Facebook wants you to believe that the problems we're talking about are unsolvable. They want you to believe in false choices. They want you to believe that you must choose between a Facebook full of divisive and extreme content or one of the most important values our country was founded upon: free speech. That you must choose between public oversight of Facebook's choices [00:07:30] and your personal privacy. That to be able to share fun photos of your kids with old friends, you must also be inundated with anger-driven virality. They want you to believe that this is just part of the deal. I am here today to tell you that's not true. These problems are solvable. A safer, free-speech-respecting, more enjoyable social media is possible. But there is one [00:08:00] thing that I hope everyone takes away from these disclosures: it is that Facebook can change, but is clearly not going to do so on its own. My fear is that without action, the divisive and extremist behaviors we see today are only the beginning. What we saw in Myanmar and are now seeing in Ethiopia are only the opening chapters of a story so terrifying, no one wants to read the end of it.
Speaker 1: Congress can change the rules that Facebook plays by and stop the many harms [00:08:30] it is now causing. We now know the truth about Facebook's destructive impact. I really appreciate the seriousness with which the members of Congress and the Securities and Exchange Commission are approaching these issues. I came forward at great personal risk because I believe we still have time to act, but we must act now. I'm asking you, our elected representatives, to act. Thank you.

Speaker 2: Thank you, Ms. Haugen.