Facebook whistleblower Frances Haugen testifies at UK Parliament
Speaker 1: I'm deeply concerned about the false choices that Facebook presents. They routinely try to reduce the discussion to things like: you can either have transparency or privacy, which do you want to have? Or: if you want safety, you have to have censorship. When in reality, they have lots of non-content-based choices that would sliver off half a percentage point of growth, a percentage point of growth, and Facebook is unwilling to give up those slivers for our safety. [00:00:30] And I came forward now because now is the most critical time to act. When we see something like an oil spill, that oil spill doesn't make it harder for society to regulate oil companies. But right now the failures of Facebook are making it harder for us to regulate Facebook.

Speaker 2: So, on those failures, looking at the way the platform is moderated today: unless there is change, do you think it makes it more likely that we will see events like the insurrection in Washington on the 6th of January [00:01:00] this year, more violent acts that have been driven by Facebook's systems? Do you think it is more likely we will see more of those events as things stand today?

Speaker 1: I have no doubt that the events we're seeing around the world, things like Myanmar and Ethiopia, are the opening chapters, because engagement-based ranking does two things. One, it prioritizes and amplifies divisive and polarizing extreme content; and two, it concentrates it. And so if Facebook comes back and says only a tiny sliver of content on our platform is hate, or only a [00:01:30] tiny sliver is violence: one, they can't detect it very well, so I don't know if I trust those numbers; but two, it gets hyper-concentrated in, you know, 5% of the population. And you only need 3% of the population on the streets to have a revolution, and that's dangerous.

Speaker 2:
I want to ask you a bit about that hyper-concentration, and in particular an area that you worked on: Facebook groups. I remember being told several years ago by a Facebook executive that the only way you could drive content through the platform is advertising. But we see that that is not true; groups are increasingly used to [00:02:00] shape that experience. We talk a lot about the impact of algorithmic recommendation tools like the news feed. To what extent do you think groups are shaping the experience for many people on Facebook?

Speaker 1: Groups play a huge and critical role in driving the experience on Facebook. When I worked on civic misinformation (this is based on recollection, I don't have a document), I believe it was something like 60% of the content in the news feed was from groups. I think a thing that's important for this group to know is that Facebook has been trying to extend those sessions, like get you to consume [00:02:30] longer sessions, more content. And the only way they can do that is by multiplying the content that already exists on the platform. And the way they do that is with things like groups and re-shares. So if I put one post into a half-million-person group, that can go out to a million people. And when combined with engagement-based ranking, that group might produce 500, a thousand pieces of content a day, but only three get delivered. And if your algorithm is biased towards extreme, polarizing, divisive content, it's like viral variants: those giant groups [00:03:00] are producing lots and lots of pieces of content, and only the ones most likely to spread are the ones that go out.

Speaker 2: It was reported, I think last year by the Wall Street Journal, that 60% of people who joined Facebook groups that shared and promoted extremist content did so at Facebook's active recommendation. So this is clearly something Facebook was researching.
What action is Facebook taking about groups that share extremist content?

Speaker 1: I don't know the exact actions that have been taken in the last, you know, six months, [00:03:30] a year. But extremist groups being actively recommended and promoted to users is a thing where Facebook shouldn't be able to just say, "This is a hard problem, we're working on it." They should have to articulate: here's our five-point plan, and here's the data that would allow you to hold us accountable. Because Facebook acting in a non-transparent, unaccountable way will just lead to more tragedies.

Speaker 2: Do you think that five-point plan exists?

Speaker 1: I don't know if they have a five-point plan, or any plan. I don't know; I didn't work on that.

Speaker 2: Okay. [00:04:00] But to what extent should we be considering groups? Should a UK regulator be asking these questions about Facebook groups? From what you are saying, they are a significant driver of engagement. And if engagement is part of the problem, the way Facebook designed it, then groups must be a big part of that too.

Speaker 1: Part of what is dangerous about groups is that, you know, we sometimes talk about this idea of: is this an individual problem, or is this a societal problem? One of the things that happens in [00:04:30] aggregate is the algorithms take people who have very mainstream interests and push them towards extreme interests. You can be someone center-left, and you'll get pushed to the radical left. You can be center-right, and you'll get pushed to the radical right. You can be looking for healthy recipes, and you'll get pushed to anorexia content. There are examples in Facebook's research of all of this. One of the things that happens with groups, and with networks of groups, is that people form echo chambers that create social norms.
So if I'm in a group that has lots of COVID misinformation, and I see over and over [00:05:00] again that when someone gives, say, COVID vaccine information, encourages people to get vaccinated, they get completely pounced upon, they get torn apart, I learn that certain ideas are acceptable and unacceptable. When that context is around hate, now you see a normalization of hate, a normalization of dehumanizing others. And that's what leads to violent incidents.

Speaker 2: I mean, many people would say that groups, particularly large groups, and some of these groups have hundreds of thousands of members in them...

Speaker 1: Yes. Millions.

Speaker 2: ...they should be much easier for the platform to [00:05:30] moderate, because people are gathering in a common place.

Speaker 1: I strongly recommend that above a certain size, groups should be required to provide their own moderators and moderate every post. This would, in a content-agnostic way, regulate the impact of those large groups, because if that group is actually valuable enough, they will have no trouble recruiting volunteers. But if that group is just an amplification point, like we see foreign information operations using groups like [00:06:00] this, and virality hacking, that's the practice of borrowing viral content from other places to build a group. If you want to launch an advertising campaign with misinformation in it, we at least have a credit card to trace you back. If you want to start a group and invite a thousand people every day (the limit is, I think, 2,200 people you can invite every day), you can build out that group, and your content will land in their news feed for a month. And if they [00:06:30] engage with any of it, it'll be considered a follow. And so things like that make groups very, very dangerous, and they drive outsized impact on the platform.

Speaker 2:
So, from what you say, if a bad actor or agency wanted to influence what a group of people on Facebook would see, you would probably set up Facebook groups to do that, more than you would set up Facebook pages and run advertising?

Speaker 1: That is definitely a strategy that's currently used by information operations. Another one that's used, which I think is quite dangerous, is you can create a new account and within five minutes go post into a million-person group, [00:07:00] right? There's no accountability, there's no trace. You can find a group to target any interest you want at a very, very fine grain. Even if you removed microtargeting from ads, people would microtarget via groups.

Speaker 2: And again, what do you think the company's strategy is for dealing with this? Because there were changes made to Facebook groups, I think in 2017, 2018, to create more of a community experience, I think Mark Zuckerberg said, which is good for engagement. But it would seem, [00:07:30] similar to changes to the way the news feed works in terms of the content that it prefers and favors, that these are reforms the company has put in place that have been good for engagement but have been terrible for harm.

Speaker 1: I think we need to move away from having binary choices. There's a huge continuum of options that exists. Come in and say: hey, groups that are under a thousand people are wonderful, they create community, they create solidarity, they help people connect; but if you get above a certain size, maybe 10,000 people, you need to start [00:08:00] moderating that group. That naturally rate-limits it.
And the thing that we need to think about is: where do we add selective friction to these systems so that they are safe in every language? You don't need the AI to find the bad content.

Speaker 2: In your experience, is Facebook testing its systems all the time? Does Facebook experiment with the way its systems work, around how it can increase engagement? Obviously, in terms of content on the news feed, we know it experimented around election time, around the sort of news that should be favored. So [00:08:30] how does Facebook work in experimenting with its tools?

Speaker 1: Facebook is continuously running many experiments in parallel on little slices of the data that it has. I'm a strong proponent that Facebook should have to publish a feed of all the experiments they're running. They don't have to tell us what the experiment is, just an ID. Even just seeing the results data would allow us to establish patterns of behavior, because the real thing we're seeing here is Facebook accepting little tiny additions of harm when it weighs off [00:09:00] how much harm is worth how much growth. Right now we can't benchmark and say: oh, you're running all these experiments, are you acting in the public good? But if we had that data, we could see patterns of behavior and see whether or not trends are occurring.

Speaker 2: You worked in the civic integrity team at Facebook. So if you saw something that was concerning you, who would you report to?

Speaker 1: This is a huge weak spot. If I drove a bus in the United States, there would be a phone number in my break room that I could call that would say: did you see something that endangered public safety? Call this number. [00:09:30] Someone will take you seriously and listen to you, in, like, the Department of Transportation.
When I worked on counterespionage, I saw things where I was concerned about national security, and I had no idea how to escalate those, because I didn't have faith in my chain of command at that point. They had dissolved civic integrity; I didn't see that they would take it seriously. And we were told just to accept under-resourcing.

Speaker 2: But in theory you would report to your line manager. Would it then be up to them whether they chose to escalate that?

Speaker 1: [00:10:00] I flagged repeatedly when I worked on civic integrity that I felt critical teams were understaffed. And I was told: at Facebook, we accomplish unimaginable things with far, far fewer resources than anyone would think possible. There is a culture that lionizes kind of a startup ethic that is, in my opinion, irresponsible: the idea that the person who can figure out how to move the metric by cutting the most corners is good. And the reality is, it doesn't matter whether Facebook is spending $14 billion [00:10:30] on safety a year if they should be spending $25 billion or $35 billion. That's the real question. And right now there are no incentives internally; if you make noise saying we need more help, you will not get rallied around for help, because everyone is underwater.

Speaker 2: In many organizations that ultimately fail, I think that sort of culture exists: a culture where there's no external audit, and people inside the organization don't share their problems with the people at the top. What do you think people [00:11:00] like Mark Zuckerberg know about these things?

Speaker 1: I think it's important that all facts are viewed through a lens of interpretation. And there is a pattern across a lot of the people who run the company, the senior leaders, which is that this may be the only job they've ever had. Like, Mark came in when he was 19, and he's still the CEO.
There are a lot of other people, VPs or directors, for whom this is the only job they've ever had. And so there is a lack of perspective: the people who have [00:11:30] been promoted were the people who could focus on the goals they were given, and not necessarily the ones who asked questions around public safety. And I think there's a real thing where people say: look at all the good we're doing. Yes, that's true. But we didn't invent hate, we didn't invent ethnic violence, and that's not the question. The question is: what is Facebook doing to amplify or expand hate? What is it doing to amplify or expand ethnic violence?

Speaker 2: You're right, Facebook didn't invent hate. But do you think it's making hate worse?

Speaker 1: It unquestionably is making hate worse. [00:12:00] Thank you.
