Facebook whistleblower Frances Haugen testifies at UK Parliament
Speaker 1: I'm deeply concerned about the false choices that Facebook presents. They routinely try to reduce the discussion to things like: you can either have transparency or privacy, which do you want to have? Or: if you want safety, you have to have censorship. In reality, they have lots of non-content-based choices that would sliver off half a percentage point of growth, a percentage point of growth, and Facebook is unwilling to give up those slivers of growth for our safety. [00:00:30] And I came forward now because now is the most critical time to act. When we see something like an oil spill, that oil spill doesn't make it harder for society to regulate oil companies. But right now the failures of Facebook are making it harder for us to regulate Facebook.

Speaker 2: So on those failures, looking at the way the platform is moderated today: unless there is change, do you think it makes it more likely that we will see events like the insurrection in Washington on the 6th of January [00:01:00] this year, more violent acts that have been driven by Facebook's systems? Do you think it is more likely we will see more of those events as things stand today?

Speaker 1: I have no doubt that the events we're seeing around the world, things like Myanmar and Ethiopia, those are the opening chapters, because engagement-based ranking does two things. One, it prioritizes and amplifies divisive and polarizing extreme content, and two, it concentrates it. So if Facebook comes back and says only a tiny sliver of content on our platform is hate, or only a [00:01:30] tiny sliver is violence: one, they can't detect it very well, so I don't know if I trust those numbers; but two, it gets hyper-concentrated in, you know, 5% of the population. And you only need 3% of the population on the streets to have a revolution, and that's dangerous.
Speaker 2: I want to ask you a bit about that hyper-concentration, and in particular an area that you worked on, and that's Facebook groups. I remember being told several years ago by a Facebook executive that the only way you could drive content through the platform is advertising. But we see that that is not true: groups are increasingly used to [00:02:00] shape that experience. We talk a lot about the impact of algorithmic recommendation tools like the newsfeed. To what extent do you think groups are shaping the experience for many people on Facebook?

Speaker 1: Groups play a huge and critical role in driving the experience on Facebook. When I worked on civic misinformation, and this is based on recollection, I don't have a document, but I believe it was something like 60% of the content in the newsfeed was from groups. I think a thing that's important for this group to know is that Facebook has been trying to extend those sessions, to get you to consume [00:02:30] longer sessions, more content. And the only way they can do that is by multiplying the content that already exists on the platform, and the way they do that is with things like groups and reshares. If I put one post into a half-million-person group, that can go out to a million people. And when combined with engagement-based ranking, that group might produce 500, a thousand pieces of content a day, but only three get delivered. And if your algorithm is biased towards extreme, polarizing, divisive content, it's like viral variants. Those giant groups [00:03:00] are producing lots and lots of pieces of content, and only the ones most likely to spread are the ones that go out.

Speaker 2: It was reported, I think last year, by the Wall Street Journal that 60% of people who joined Facebook groups that shared and promoted extremist content did so at Facebook's active recommendation. So this is clearly something Facebook was researching.
What action is Facebook taking about groups that share extremist content?

Speaker 1: I don't know the exact actions that have been taken in the last, you know, six months, [00:03:30] year. But extremist groups being actively recommended to users, promoted to users, is a thing where Facebook shouldn't be able to just say, this is a hard problem, we're working on it. They should have to articulate: here's our five-point plan, and here's the data that would allow you to hold us accountable. Because Facebook acting in a non-transparent, unaccountable way will just lead to more tragedies.

Speaker 2: Do you think that five-point plan exists?

Speaker 1: I don't know if they have a five-point plan, or any plan. I didn't work on that.

Speaker 2: Okay. [00:04:00] But to what extent should we be considering groups? Should a UK regulator be asking these questions about Facebook groups? From what you are saying, they are a significant driver of engagement, and if engagement is part of the problem, the way Facebook designed it, then groups must be a big part of that too.

Speaker 1: Part of what is dangerous about groups is, and we sometimes talk about this idea of whether this is an individual problem or a societal problem, one of the things that happens in [00:04:30] aggregate is that the algorithms take people who have very mainstream interests and push them towards extreme interests. You can be someone center-left, and you'll get pushed to the radical left. You can be center-right, you'll get pushed to the radical right. You can be looking for healthy recipes, you'll get pushed to anorexia content. There are examples in Facebook's research of all of this. One of the things that happens with groups and with networks of groups is that people see echo chambers that create social norms.
So if I'm in a group that has lots of COVID misinformation, and I see over and over [00:05:00] again that when someone says anything encouraging people to get vaccinated, they get completely pounced upon, they get torn apart, I learn that certain ideas are acceptable and unacceptable. When that context is around hate, now you see a normalization of hate, a normalization of dehumanizing others. And that's what leads to violent incidents.

Speaker 2: Many people would say that groups, particularly large groups, and some of these groups have hundreds of thousands of members in them...

Speaker 1: Yes. Millions.

Speaker 2: ...should be much easier for the platform to [00:05:30] moderate, because people are gathering in a common place.

Speaker 1: I strongly recommend that above a certain size, groups should be required to provide their own moderators and moderate every post. This would, in a content-agnostic way, regulate the impact of those large groups, because if that group is actually valuable enough, they will have no trouble recruiting volunteers. But if that group is just an amplification point, like we see with foreign information operations using groups like [00:06:00] this, and virality hacking, which is the practice of borrowing viral content from other places to build a group... If you want to launch an advertising campaign with misinformation in it, we at least have a credit card to trace you back. If you want to start a group, you can invite a thousand people every day (the limit is, I think, 2,200 people you can invite every day), you can build out that group, and your content will land in their newsfeed for a month. And if they [00:06:30] engage with any of it, it'll be considered a follow. Things like that make groups very, very dangerous, and they drive outsized impact on the platform.
Speaker 2: So from what you say, if a bad actor or agency wanted to influence what a group of people on Facebook would see, you would probably set up Facebook groups to do that, rather than set up Facebook pages and run advertising.

Speaker 1: That is definitely a strategy that's currently used by information operations. Another one that's used, which I think is quite dangerous, is that you can create a new account and within five minutes go post into a million-person group, [00:07:00] right? There's no accountability, there's no trace. You can find a group to target any interest you want, at a very, very fine grain. Even if you removed microtargeting from ads, people would microtarget via groups.

Speaker 2: And again, what do you think the company's strategy is for dealing with this? Because there were changes made to Facebook groups, I think in 2017, 2018, to create more of a community experience, I think Mark Zuckerberg said, which is good for engagement. But it would seem, [00:07:30] similar to changes to the way the newsfeed works in terms of the content it prefers and favors, that these are reforms the company has put in place that have been good for engagement but have been terrible for harm.

Speaker 1: I think we need to move away from having binary choices. There's a huge continuum of options that exists. Coming in and saying, hey, groups that are under a thousand people are wonderful, they create community, they create solidarity, they help people connect; but if you get above a certain size, maybe 10,000 people, you need to start [00:08:00] moderating that group, because that alone naturally rate-limits it.
And the thing that we need to think about is: where do we add selective friction to these systems so that they are safe in every language? You don't need the AIs to find the bad content.

Speaker 2: In your experience, is Facebook testing its systems all the time? Does Facebook experiment with the way its systems work, around how it can increase engagement? Obviously, in terms of content on the newsfeed, we know it experimented around election time with the sort of news that should be favored. So [00:08:30] how does Facebook work in experimenting with its tools?

Speaker 1: Facebook is continuously running many experiments in parallel on little slices of the data that they have. I'm a strong proponent that Facebook should have to publish a feed of all the experiments they're running. They don't have to tell us what each experiment is, just an ID, and even just seeing the results data would allow us to establish patterns of behavior. Because the real thing we're seeing here is Facebook accepting little tiny additions of harm, when they weigh off [00:09:00] how much harm is worth how much growth for us. Right now we can't benchmark and say, oh, you're running all these experiments, are you acting in the public good? But if we had that data, we could see patterns of behavior and see whether or not trends are occurring.

Speaker 2: You worked in the civic integrity team at Facebook. So if you saw something that was concerning you, who would you report it to?

Speaker 1: This is a huge weak spot. If I drove a bus in the United States, there would be a phone number in my break room that I could call, that would say: did you see something that endangered public safety? Call this number. [00:09:30] Someone will take you seriously and listen to you, like at the Department of Transportation.
When I worked on counterespionage, I saw things where I was concerned about national security, and I had no idea how to escalate those, because I didn't have faith in my chain of command at that point. They had dissolved civic integrity; I didn't see that they would take it seriously. And we were told just to accept under-resourcing.

Speaker 2: But in theory you would report to your line manager. Would it then be up to them whether they chose to escalate that?

Speaker 1: [00:10:00] I flagged repeatedly when I worked on civic integrity that I felt that critical teams were understaffed, and I was told, at Facebook we accomplish unimaginable things with far, far fewer resources than anyone would think possible. There is a culture that lionizes kind of a startup ethic that is, in my opinion, irresponsible: the idea that the person who can figure out how to move the metric by cutting the most corners is good. And the reality is, it doesn't matter that Facebook is spending 14 billion [00:10:30] on safety a year if they should be spending 25 billion or 35 billion. That's the real question. And right now there are no incentives internally: if you make noise saying we need more help, people will not rally around to help you, because everyone is underwater.

Speaker 2: In many organizations that ultimately fail, I think that sort of culture exists: a culture where there's no external audit, and people inside the organization don't share their problems with the people at the top. What do you think people [00:11:00] like Mark Zuckerberg know about these things?

Speaker 1: I think it's important that all facts are viewed through a lens of interpretation. And there is a pattern across a lot of the people who run the company, the senior leaders, which is that this may be the only job they've ever had. Mark came in when he was 19, and he's still the CEO.
There are a lot of other people who are VPs or directors for whom this is the only job they've ever had. And so the people who have [00:11:30] been promoted were the people who could focus on the goals they were given, and not necessarily the ones who asked questions around public safety. And I think there's a real temptation for people to say, look at all the good we're doing. Yes, that's true, but we didn't invent hate. We didn't invent ethnic violence. And that's not the question. The question is: what is Facebook doing to amplify or expand hate? What is it doing to amplify or expand ethnic violence?

Speaker 2: You're right, Facebook didn't invent hate. But do you think it's making hate worse?

Speaker 1: It is unquestionably making hate worse. [00:12:00]

Speaker 2: Thank you.