
No single solution to self-driving cars' trolley problem, study says

Respondents disagree on whom an autonomous car should kill in the event of an unavoidable crash.

Jake Holmes, Reviews Editor

It's a question that both proponents and detractors of self-driving cars like to pose: If a crash is unavoidable, whom should an autonomous vehicle save, and whom should it sacrifice? MIT set out to help answer that question by surveying 2.3 million people.

The biggest takeaway: Respondents from different countries, cultures and social groups all answered the questions differently.

Photo: A Ford autonomous vehicle prototype. If it can't avoid a crash, how should an autonomous vehicle decide what to hit? (Ford)

"People who think about machine ethics make it sound like you can come up with a perfect set of rules for robots, and what we show here with data is that there are no universal rules," said MIT computer scientist and study co-author Iyad Rahwan.

The study, whose results were published today in the journal Nature, involved an online questionnaire called The Moral Machine in which respondents from around the world were asked to pick between two outcomes in various situations -- similar to the famous "trolley problem" ethical thought experiment. For instance, should a self-driving car hit a group of children or elderly people, or should it crash into a homeless person or a corporate executive?

In a general sense, the MIT researchers found that people agreed on certain principles. In a news release Wednesday, MIT said the overall preference was for autonomous cars "sparing the lives of humans over the lives of other animals; sparing the lives of many people rather than a few; and preserving the lives of the young, rather than older people."

Photo: Toyota's Platform 3.0 autonomous test vehicle. The MIT study recorded 40 million answers from people in 233 countries and territories. (Toyota)

What's interesting, however, is that the study found preferences varied depending on a respondent's location and cultural norms. People from "relatively prosperous countries with strong institutions," like Finland and Japan, were more likely to say that a jaywalking pedestrian should be killed, for instance. And in Colombia, a country with high economic inequality, the study found that people were more likely to say the self-driving car should kill a homeless bystander rather than a successful executive.

The study's authors grouped respondents into three geographic and cultural clusters: Eastern, Western and Southern. Among the other differences they found: people in Eastern countries, including much of Asia, were less likely to favor sparing young people over the elderly than respondents in North America and Western Europe. Meanwhile, people from Southern countries, a cluster that includes South America, were far more likely to say machines should spare the lives of women than were respondents from the Eastern or Western groups.

The bottom line for developers of autonomous vehicles is that programming a defined set of ethical guidelines for self-driving cars will be difficult. It's easy to say that such vehicles should avoid a crash at all costs -- but when forced to "choose" between causing injuries or deaths, writing a rulebook won't be simple.
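To see why, consider a minimal sketch, written in Python and purely hypothetical -- the Outcome fields, the rule ordering and the numbers below are invented for illustration, not taken from the study or from any real vehicle software. Even the three broad preferences the study did find agreement on have to be ranked against one another the moment they conflict:

from dataclasses import dataclass

@dataclass
class Outcome:
    harms_humans: int    # number of people harmed
    harms_animals: int   # number of animals harmed
    mean_age: float      # average age of the people harmed

def prefer(a: Outcome, b: Outcome) -> Outcome:
    """Return the outcome a hypothetical rulebook would choose to cause."""
    # Rule 1: harm fewer humans.
    if a.harms_humans != b.harms_humans:
        return a if a.harms_humans < b.harms_humans else b
    # Rule 2: harm fewer animals.
    if a.harms_animals != b.harms_animals:
        return a if a.harms_animals < b.harms_animals else b
    # Rule 3: if humans must be harmed, prefer harming the older group.
    return a if a.mean_age >= b.mean_age else b

# Two outcomes that pit "fewer people" against "younger people":
swerve = Outcome(harms_humans=1, harms_animals=0, mean_age=8.0)
stay = Outcome(harms_humans=2, harms_animals=0, mean_age=70.0)
print(prefer(swerve, stay))  # Rule 1 wins: harm fewer people, even if younger

The study's deeper finding makes this even harder: whatever ordering a developer picks for rules like these, respondents in different cultures would rank them differently.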

"The study is basically trying to understand the kinds of moral decisions that driverless cars might have to resort to," Edmond Awad, an MIT postdoc and the paper's lead author, said in a statement. "We don't know yet how they should do that."

