From Jim Crow to 'Jim Code': How tech exacerbated racial injustice
Culture
The technology industry is built around the assumption that science and tech can solve many of humanity's biggest problems and overcome long-standing obstacles to progress.
However, that narrative has been fundamentally called into question in recent years as systems like artificial intelligence and facial recognition have actually reinforced the racial divide and are potentially making the problem worse.
Now what?
Our guest today is Dr. Ruha Benjamin, sociologist, Professor of African American Studies at Princeton University, and author of the book Race After Technology.
This book is a true deep dive into the impact of technology on race relations. And it also offers positive prescriptions for how we can better design the systems of the future to overcome the current limitations and injustices.
So Dr. Benjamin, can you talk a little bit about the book that you wrote and the responsibility that the technology industry shares on this important topic?
A few years ago I was on sabbatical, working on a completely different project, looking more at the life sciences and populations.
And I was noticing these headlines about so-called racist robots. There was a first wave of headlines that was like, "My god, how can robots be racist?" And by robots, that's kind of a heuristic for automation and AI and technology more broadly.
And then there was a second wave of stories that seemed less surprised.
Like, "Of course, technology inherits its creators' biases." And then slowly I started seeing a shift where people were grappling with the issues of algorithmic discrimination, machine bias and so on, and it was more about trying to figure out how to create tech fixes for these issues.
So as someone who was trained in sociology, and specifically the sociology of science and technology, I was noticing a kind of gap between these emerging conversations and a longer history of scholars and activists who've been thinking about the relationship between technology and society, even going back to Martin Luther King, who warned about guided missiles and misguided men. So there you have a kind of seed of thinking about automation, the deadly underside of automation, and I wanted to put this longer history of activism and scholarship in conversation with what's happening today.
And we see how it's become more and more relevant, even just this week, with all of the announcements of major companies pulling their facial recognition systems back from police and policing.
And so I think it's just gonna become more and more relevant for people to be thinking critically about our digital world.
And so that's what my book is trying to bring to the fore of our conversation.
Excellent.
So there's this narrative that's been part of the technology industry for really the past couple of decades: this idea that tech is a force of change and of progressive values, that through it we will overcome a lot of the challenges that have held humanity back in the past. And we've seen that narrative really get called into question in the last few years, especially around the issue of race.
You mentioned facial recognition. Because the people designing it are not diverse, they have biases; maybe they have no intention of discriminating, but their own biases, their own lack of a bigger vision, have led them to design systems that are now causing a lot of challenges for people of color: they get pulled over more, they get flagged for offenses more. And so it's reinforcing, if not making worse, what was already a difficult system to deal with if you are a marginalized person, a person of color.
And in your TED talk, and this goes back to 2015, you said, "At the heart of discriminatory design is the idea that we can create technological fixes for social crises. Rather than dealing with the underlying conditions, we create short-term responses that get the issue out of sight, out of mind." I thought maybe you could talk a little bit about that, because even though this was five years ago, it gets to the heart of exactly the challenge we're facing.
Absolutely. What you described is a kind of backlash against the promises of big tech, or what some people call the techlash. It exists in part because that narrative has been so powerful, that narrative coming out of Silicon Valley that technology is gonna save us, right? We just have to allow the experts to figure out how to deal with it.
But there's also another narrative that goes hand in hand with that one, one that has maybe been around even longer and comes more out of the Hollywood world, which is: technology's gonna slay us. Right, Terminator.
Yeah.
On the surface these seem like opposing stories, right?
One is, it's our savior. The other is, it's going to devour humanity. It's going to take all the jobs; it's going to, you know, The Matrix is going to come back and make us batteries.
And so they seem like opposing stories, but under the surface they share an underlying logic: that technology is in the driver's seat and we're simply affected by it.
It's either going to help us or harm us.
But there's a third story, and that third story is what I'm trying to tell in Race After Technology: technology is not in the driver's seat. We need to pull back the screen and look at who's actually developing the technology, and within what kind of incentive structure, what kind of ecosystem, it's being developed. And the fact is, much of our technology is being developed and conceived of by a small sliver of humanity. And this sliver of humanity has projected onto everyone else its own vision of the good life.
[LAUGH] What it considers to be efficient, even what it considers to be unbiased, as one of the promises that comes out of the supposed neutrality of this particular population; and that corresponds with the invisibility of whiteness and the hubris of patriarchy. In fact, it's the attitude of: we know what's better for you than you do.
And so I think what we have to do is think carefully about this third story: who's behind the screen?
And not only consider diversifying who is behind the screen. That's important, but it's not sufficient, because if the ecosystem remains the same, if the context and the incentive structure in which that diverse workforce is developing technology remain the same, and by that I mean where the profit imperative trumps other kinds of public goods, then you can have as diverse a workforce as you want and you're still gonna get many of the same problems that we see today.
And so when it comes to, for example, the facial recognition you mentioned: first of all, facial recognition systems don't work very well. And yet they're still bought and sold across our society, in every institution, not just policing. So what that tells us is that stuff can not work and still be profitable.
That's the first thing.
But the other thing is, even if it worked perfectly, even if it didn't have all these false positives, most of which fall on darker-skinned people, which is even more dangerous when we think about the current climate; even then, if it's operating within institutions that engage in discrimination, that breach people's privacy rights, a technology that works well is dangerous, and maybe even more dangerous precisely because it works well.
And so part of what I want us to think about is not just focusing our attention on creating better technology that's less biased, that works better, but thinking about the entire ecosystem.
What would it mean to develop technology in the public interest, for the public good? Not just as rhetoric, because everyone developing technology says this is gonna better humanity, this is gonna save humanity; I mean literally in the incentive structure, in the economic and social governance of technology.
What would it mean to create an ecosystem that doesn't rely on the good intentions of an individual designer, where we say, "I hope they're looking out for the rest of us," and then sort of turn our backs, but that actually ensures the stuff being developed is in service to the public good and the common good? Not the bottom line, not shareholders, not private interests that will continue to profit off of things like racism and sexism, which are currently built into the design of so many of our automated systems.
And that wouldn't really have to happen at the individual level, right? It would have to happen at the level of leadership in some of these companies, as well as boards of directors and all of those things, because we have these very hierarchical structures. You know, those companies reflect the hierarchical nature of our society.
And so is that what you're saying?
Do you think that's the-
I think that has a role to play. You're talking about the internal structure of companies, the culture of these companies, and certainly that has a role to play. But in some ways we can't wait for that to happen, because there's no incentive for them to really make these dramatic changes.
That's where the role of protest and public demand for certain changes comes in.
And also governance: what does it really mean to have public accountability that's independent of these companies?
So we need the internal transformation, but we really need to think about what a governing structure would look like that's not as corrupt as what we have in Washington now, but also not as self-interested as what we have in Silicon Valley. And so we need something apart from both of those really unsavory ways of thinking about how to create a good ecosystem.
So that really does get to accountability.
It's one of the biggest parts of the narrative going on in the world today: accountability, whether it's police departments or governments or all those using their influential powers in ways that are often not concerned enough with social justice.
Yeah, I mean, accountability starts with even the very basic idea that we should know what systems are in use in our lives, the systems that are making decisions about us that we don't even know about.
Tomorrow, the city of New York is passing a bill that would demand that the NYPD reveal its use of facial recognition: how it's been used, and to what ends. The fact that you have to pass a whole bill just to get a public agency to reveal how it's using technology tells us that so much of this is shrouded in secrecy.
So we need to move those veils out of the way, because before we can even begin to question it, we need to know what's actually happening. And in fact, in many areas of our lives, not just policing: in our employment contexts, in our hospital contexts, in our educational contexts, those in charge have been adopting technologies thinking it's gonna make things more efficient, thinking it's gonna make things more fair, and it's really just deepening and hiding the problems. We need to bring all of that into the light right now.
Excellent.
So one more question for you.
Thank you so much.
Let's talk a little bit about the effect of technology on jobs, because technology is transforming the job market right out from under us; it's sometimes overlooked, and it has huge effects. What is its effect on social justice and these kinds of things? Because obviously the jobs being created are often in technology, and the jobs being eliminated are often held by lower-income people who are sometimes at the most risk. So we would love your perspective on that.
Yeah, absolutely.
So there are two different ends of this that I think are important to keep in mind. One is the basic idea that when we talk about tech jobs, we often have in mind the white-collar jobs, the engineering jobs, the high-paying jobs, right? That's the reason everyone goes into STEM.
But underneath that there's a whole layer of jobs that support the tech industry, jobs that are tech jobs but aren't identified as such. For example, the armies of content moderators around the world, not just in the United States, who filter what we actually see on social media and in our digital lives and make it even usable. We wouldn't be able to use these technologies, because we would be so disgusted [LAUGH] at what we were seeing, if it were not for armies of content moderators who, in many cases, are actually traumatized by their work.
There's a great documentary on Netflix called The Cleaners about the people who do this work, and a fantastic book called Behind the Screen by my colleague, that you all should look up, which really shows us this whole area of tech labor we don't consider. So that's one thing to keep in mind.
The other is all of the blue-collar work, for example in Amazon warehouses, that not only relies on these technology platforms but has them deployed against the workers themselves: automating work, using algorithms to keep track of how fast people are producing, which makes the labor even more oppressive than it already is.
We have accounts of Amazon warehouses in Minnesota where the mostly Somali workforce has had to hold up signs in protest that say "I am not a robot," because they feel like their own flesh and blood is being automated.
So there's that deployment of technology back onto the tech workforce. Then there's a whole other area that I think many people may start to relate to more, which is the way that even the point of entry, when we're being hired for work, is being automated. Your interview is not even with a person; it's with a system that judges you according not just to what you say and how you act, but to thousands of data points that are compared to existing top-performing employees in these organizations.
So if, for example, many of these organizations have engaged in discriminatory employment practices for, let's say, the last 50 years, that makes their workforce homogeneous in terms of race and gender, and you're being compared to that workforce according to these so-called neutral data points. What's happening is that those same divisions, those same forms of discrimination, are being reproduced under the guise of neutrality.
And that's what I call the new Jim Code, because now you don't have a sign hanging on the door that says, "Blacks need not apply." You just do your interview [LAUGH] online, and then your posture, your vocal tone, your eye twitch, everything is compared to a homogeneous workforce, and your application is filtered out.
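To make that filtering mechanism concrete, here is a minimal, purely hypothetical Python sketch; it is not any real vendor's system, and every feature name, number, and threshold in it is invented for illustration. It scores an applicant by resemblance to a historically homogeneous set of "top performers," so a candidate who differs on demographically correlated traits gets rejected even though the code never mentions race or gender.

```python
import statistics

# Hypothetical "top performer" profiles drawn from decades of biased hiring;
# every feature vector reflects one narrow demographic's voice and mannerisms.
# All feature names and numbers here are invented for illustration.
top_performers = [
    {"vocal_pitch_hz": 110, "eye_contact": 0.90, "posture_score": 0.80},
    {"vocal_pitch_hz": 115, "eye_contact": 0.85, "posture_score": 0.75},
    {"vocal_pitch_hz": 108, "eye_contact": 0.92, "posture_score": 0.82},
]

def similarity(applicant, reference):
    """Naive similarity: negated mean absolute difference across features."""
    return -statistics.mean(
        abs(applicant[key] - reference[key]) for key in reference
    )

def passes_screen(applicant, threshold=-10.0):
    """Admit the applicant only if they resemble the historical workforce.

    Nothing below mentions race or gender, but features like vocal pitch
    correlate with demographics, so the filter quietly reproduces the old
    exclusion under the guise of neutral data points.
    """
    best_match = max(similarity(applicant, ref) for ref in top_performers)
    return best_match >= threshold

# An applicant whose voice and bearing differ from the historical norm is
# filtered out before any human ever watches the interview.
applicant = {"vocal_pitch_hz": 190, "eye_contact": 0.70, "posture_score": 0.90}
print(passes_screen(applicant))  # False: rejected by resemblance alone
```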
So what I'm trying to say is that technology and employment intersect at so many different points, and what we have to do is rid ourselves, disabuse ourselves, of the idea that technology necessarily makes things fair, because often it doesn't; it hides forms of oppression and discrimination under the guise of technologically mediated progress. So we need to wake up.