How bias gets coded into our technological systems
Artificial intelligence now plays a key role in deciding who gets jobs, who gets into colleges, who gets loans, who gets accused of crimes and so much more. But recent work from researchers has shown that the algorithms driving AI are inheriting, and in some cases even accelerating, the biases behind inequality and injustice in our society, especially against women and people of color. Well, now a new documentary on Netflix called Coded Bias tackles this issue head-on, and we've got the director, Shalini Kantayya, here to talk about it. So Shalini, tell us what Coded Bias has set out to do. >> Well, thanks so much for having me. Coded Bias follows MIT researcher Joy Buolamwini's startling discovery that commercially available facial recognition does not see darker-skinned faces or women accurately, and sort of takes us down the rabbit hole of an exploration of widespread bias and algorithmic harms in so many of the technologies that we interact with every day. It's important to say that three years ago, when I set out to make this film, I didn't even really know what an algorithm was. Everything that I knew about artificial intelligence sort of came from the mind and imagination of sci-fi creators like Steven Spielberg or Stanley Kubrick or Ridley Scott, and I hadn't realized, exactly as you said in your opening, the extent to which algorithms, machine learning, AI are increasingly becoming a gatekeeper of opportunity, deciding such important things as who gets hired, who gets what quality of health care, increasingly even who gets the vaccine, how long a prison sentence someone may serve. And so I started to understand the extent to which we are outsourcing our decision making to machines through a groundbreaking book by Cathy O'Neil called Weapons of Math Destruction.
I simultaneously came across a TED talk that Joy gave and began to realize that these same systems that we're trusting so implicitly with decisions that are essentially changing human destiny have not been vetted for racial bias, or for gender bias, or more broadly that they won't hurt people, have unintended consequences and cause harm. And in some cases they haven't even been vetted for some standard of accuracy that is shared outside of the company that stands to benefit economically. And that's when I really began to realize that everything that we love as people of a democracy, access to fair and accurate information, fair elections, equal rights, fair housing, fair employment, all the gains that we've made over 50 years in terms of civil rights, could be rolled back in the name of trusting these algorithms to be neutral when they're not. And that sort of set me on the journey to make the film and kept me engaged in the years making it. >> What's being put forward is this idea that the algorithms are choosing, the computers, and that's going to remove bias from the equation. That's the assumption. But when you dig a little deeper, what your documentary has done is expose that not only has it not made things less biased, but in some cases it's actually made them more biased. So can you talk a little bit about that? >> We have some misconceptions about what AI can and can't do. And I think when it comes to artificial intelligence, all of the knowledge has been in the hands of the few, so all the power has been in the hands of the few. And it wasn't really until making this film that I could even start to discern what is actual science, and when big corporations are selling us a bag of tricks, some bogus, baloney pseudoscience.
All in the name of profits. And I think we tend to think of technology kind of like our gods, when they're sort of more like our children: reflections of ourselves, and even of the flaws in ourselves that we are not wanting to reproduce. And I really came to understand, first of all, that as human beings, bias is an innate human condition. It's not just in some bad people. It's in all of us, and it's unconscious to all of us. And those biases can get encoded in the technology that we're programming, sometimes even in spite of the best intentions of the programmers. That's what's so scary about this stuff. Take, for instance, an Amazon hiring algorithm that was used to, ideally, make the world more fair: let's take the human beings and human bias out of the equation and leave it up to the machines. And that AI system was looking at who got hired, who got retained over a number of years, and who got promoted. And lo and behold, the machine began to essentially discriminate, to pull out every applicant that it could discern was a woman. And that happened even in spite of the best intentions of the programmers. And that goes to show just how problematic it is to make these predictions based on data and a past that is written with systematic inequalities. >> It doesn't take too much to understand why these things happen. Machine learning, and even what's now called deep learning, which is a lot of what AI is about, is where programmers aren't programming every instance in, but what they're doing is teaching algorithms how to learn and discern patterns and then to replicate those patterns. And so you can see, if you fed it a bunch of data, just like you're saying, about which applicants were successful, or which ones got promotions, or which ones maybe went on to earn more.
All of these things would naturally code in some of these biases that we know have been part of the system for a number of reasons that have been explored by social scientists and people working in corporate HR departments. The algorithms in that case are just taking that and accelerating it, saying, great, we want candidates that look just like this or are just like this. >> Absolutely. And I think what is so scary about this process is that sometimes we don't even know an automated decision maker has denied us an opportunity, and these systems are completely opaque to us. Just to give another example along the lines of what you're saying: Steve Wozniak, one of the founders of Apple, and his wife applied for the same Apple credit card. And Steve Wozniak was complaining, why did my wife get a lower credit limit than I did on the Apple credit card, when we have all the same assets and all the same income? And it could be that the scoring system is sort of picking up on the fact that maybe women have had a shorter history of access to credit, a shorter history of access to mortgages in this country. And what is so frightening about that is, I assume Steve Wozniak's wife is going to be okay. But for the rest of us, we don't know why we are being denied credit, or have less of a line of credit than a man who has the same background as we do. And you can see the way in which this could infringe on civil rights advances that we've worked so hard for over 50 years. >> My other favorite line in the documentary is "data is destiny." And so that gets to some of what you're talking about. And there are the obvious economic parts of what we're talking about, but there are also some other kind of scary societal ones. Maybe you could talk a little bit about the apartment complex in Brooklyn. >> What is so scary is when democracies start picking up the tools of authoritarian states with no democratic rules in place.
And that's what happened in a housing complex not far from where I live in Brooklyn, where long-time residents of that building, Tranae Moran and Icemae Downes, didn't even know what biometric data was when their landlord tried to install it in their housing complex, in their building. This is a building that already had cameras and a security guard and key fobs, and it was unknown to them why they needed this other level of it. And again, these people are not going into a maximum security prison. They're just trying to go home and live a dignified life. And it's important to say that the same landlord actually owns property in upper-income communities in Manhattan, but that's not where he tried to put the facial recognition system in. And what is so hopeful is that not only did Tranae Moran and Icemae Downes organize their friends and their neighbors to fight against their landlord installing this kind of racially biased, invasive surveillance technology in the place they live, and win, they kept the landlord from doing that, but they also inspired the first legislation in the state of New York that would protect other residents from that same invasive surveillance. That goes to show sort of why I make documentaries, because it reminds me that everyday people can make a difference, and that not everybody who is a superhero wears a cape. And one thing that I hope Coded Bias does is pull out a chair for all of us and give us all a place at the table, because these systems are deployed on all of us, and they are impacting more and more of our civil rights. So it's imperative that we all get literate about the systems that will define our future. >> The part that was explored in the documentary on this was a very famous idea in the tech industry, that the future is here, it's just not evenly distributed.
And we often think that to mean that people who have money or inside access, or who live on the cutting edge, get early access to all of these technologies, and then eventually it kind of trickles down to everybody else. But the documentary turned that on its head, saying that in this case, like with AI, it's actually lower-income people, people who have less access to politicians and influence, that the technology is being deployed against, deployed on, whether it's law enforcement or corporations trying to profile people for commercial reasons, with the idea that eventually they'll deploy it on everybody else. That was really interesting and, I think, a powerful insight from the film. >> Yeah, absolutely. You're citing the work of Virginia Eubanks, who's profiled in the film and wrote a groundbreaking book called Automating Inequality. And it really talks about how communities of color and low-income communities are often the place where big tech experiments, just like that housing complex, where there's very little threat of people standing up for their rights. It just makes clear that we all need to understand how these systems work. >> Sort of one of the bedrocks of democracy is that we have those rights of privacy, because you don't know who is going to be in power and what they could use it for. Is there anything more you want to say about that? That seems to be one of the biggest underlying thrusts of the work. >> I did not realize until making this film what a complete psychological profile can be built about each one of us, by name, using data that's either brokered between private companies or usurped by nefarious third-party actors. It is power that I have never seen in any other context at any other time in society.
I mean, states have wanted for years to have this kind of power. In the 2016 election, you had Cambridge Analytica create psychological profiles, by name, of the 100,000 people it believed could swing an election, and then start marketing misinformation to those 100,000 voters. And what Cambridge Analytica did as a leak is Facebook's business model every day of the week and twice on Sunday. And in the film Coded Bias, I give an example that cites a study published in Nature by Facebook, where they just did a slight manipulation, and it said "I voted" with your friends' faces, and it literally turned out tens of thousands more voters to the polls. And it just showed that Facebook could swing elections with very slight manipulations of its algorithm, without anyone even knowing. This is what is at stake here in our democracy. In the film, it was important for me to show a global context to data protection. And so I sort of show the Chinese example, where you see, through facial recognition software, what happens when an authoritarian regime has unfettered access to your data, and that's combined with a social credit score that is based not just on your behavior but on your friends' behavior. And everyone in democracies sees that scenario and they lose their mind. But at the same time, I feel that we in democracies are closer to that reality than we may think. We're not thinking about the way algorithms, what Cathy O'Neil calls algorithmic obedience training, are sort of remaking our society and our behavior. And I think there's a part of all of us that thinks, cool, I could buy a candy bar with my face, I could pay for dinner with my face, without really thinking about what we're losing in this sort of race to efficiency. And I think, for me, it sort of begs the question: is the goal of human civilization to be as efficient as possible, to go as fast as possible?
Or is it to build a society based on the inherent value and dignity of every human being? And if it's the latter, then we need to think really radically differently about the way that we're building our technology. In Coded Bias, there are two examples that take place on the streets of London, where the police are trialing facial recognition. And at the time, the UK was part of Europe and protected by the GDPR, which is, I think, the only legislation in the world that really starts to put data rights in a civil rights and human rights framework. And I think it's really important to say that I had to go to Europe to get that footage, because here in the US the systems are being used by police in secret, and there's no way that, as a journalist, I would have the kind of access that would make that process transparent for me. I think what is so important here is that this intersects with every right and freedom that we enjoy as free people of a democracy. If you go to a protest, police can scan your face and know where you go, if you have probation status or immigration status, or you're on public assistance, or vulnerable in any sort of way. If you post something on Facebook and the algorithm sort of hides it at the bottom of the feed, like it did when Elizabeth Warren talked about regulating Facebook, do we have free speech? And that is what is at stake when our public squares in democracies are moving to these private corporate spaces of technology, increasingly becoming technocracies, and we're expecting our democratic rights to translate, and they haven't. And so we really need some policy changes to make sure that our democratic values are encoded in these technologies that will define the future. >> That brings me to this question. Last year we had on the program Dr.
Ruha Benjamin, a professor at Princeton, who wrote the book Race After Technology, and she talked about these kind of competing narratives in society: that the technology will save us, right? It will remove bias, it will make things more equal, it will solve the problems of the world and the problems of society. That's the Silicon Valley version. And then there's the Hollywood version, that the technology will slay us, right? We have The Terminator, we have I, Robot, and so many other narratives, films, movies that show this. These are competing narratives right now that are having a tug of war in our world, and do you see that? It feels like Coded Bias is helping to deconstruct a little bit of this idea as well. >> I love the comparison. I have to say, I didn't realize until making Coded Bias that there's always been this sort of dialogue between science fiction filmmakers and writers and thinkers and AI developers. And I think, you know, both have been sort of dominated by men. >> [LAUGH] In a particular way, yeah. >> It's sort of a limited, homogeneous kind of imagination. And I think that when only 14% of AI researchers are women, literally half the genius has been missing from the conversation. And in our imagination about what these technologies are capable of, we've had a stunning lack of imagination. You know, I was on a panel with Cathy O'Neil, and she talks about it: it's data science, so let's treat it like a science. Let's not treat it like a faith-based religion. And I think part of it is that we can't yet discern what is real science and what is pseudoscience, and part of it is what Meredith Broussard calls technochauvinism, this idea that we think that technology is this white knight that's gonna save us all, and that a technological solution is always the best solution. And I would, and I think much of the cast of Coded Bias would, challenge that notion that a technological solution is always the best one.
I think often when I speak to technology companies, there's this impetus to think, it was just bad data, garbage in, garbage out; we'll just fix the data set and we'll come up with a perfect, superintelligent algorithm to run society. And I don't think that's what I'm saying at all. I think that it is really about creating a more humane society and creating technologies that are in service of the inherent value of every human being. And in terms of technology, I love tech. In the end, they're really just tools, and I've seen sea change that I never thought was possible. In June 2020, I saw IBM saying that they would get out of the facial recognition game. They won't sell it or deploy it. They're done. They disrupted their whole business model. Microsoft said they won't sell it to law enforcement, and Amazon said they would press pause. And I think that was sea change we never thought was possible, partly because of the brave scientists in Coded Bias, partly because of this wave of science communication and science literacy around these technologies and what they can do, and partly because we had the largest movement for civil rights and equality that we've seen in 50 years in June 2020. And people were starting to make the connections between racially biased, invasive surveillance technology in the hands of law enforcement, with nobody that we elected giving oversight, and the communities that are most brutalized and have the most to lose. And I think that gives us a recipe for how change can happen. We need great science unencumbered by corporate interest. We need mass-scale literacy around how these systems that impact us all work. I mean, a 10-year-old is going to start using this in the fifth grade; that's when we need to start educating around tech. And we need to engage people, pushing our policymakers to protect our civil rights.
And so to me, we're really in a moment where the cement is still wet, where we the people have a moonshot moment to really push for greater ethics and greater humanity and greater fairness around these technologies that will shape the future. And so it's my hope that people will go to the codedbias.com take-action page and support a great organization like the ACLU, the Algorithmic Justice League, the Electronic Frontier Foundation and so many others. I've seen time and time again how a small group of people can really turn the tide. And so it's my hope that we will all participate. Coded Bias is really that rallying call, that we all have a place at the table to shape how these technologies are used in the 21st century. [MUSIC] >> Shalini, thank you for the powerful documentary, for taking this important issue and really expanding and enriching the conversation on this topic. We recommend people go and take a look at Coded Bias; they can find it on Netflix. And thank you so much for being here to talk with us about it. >> Thanks so much. It was an honor.
