
Robo-cars face a new threat: Lawyers

Robot enthusiasts debate ways to protect self-driving cars and other autonomous machines from the looming existential threat of class action lawsuits.

Declan McCullagh, Former Senior Writer
Declan McCullagh is the chief political correspondent for CNET. You can e-mail him or follow him on Twitter as declanm. Declan previously was a reporter for Time and the Washington bureau chief for Wired and wrote the Taking Liberties section and Other People's Money column for CBS News' Web site.
Self-driving cars, including Google's, could be run off the road by the existential threat of trial lawyers. Getty Images

STANFORD, Calif. -- Self-driving cars are expected to save lives: a vehicle driven by a human will experience, on average, a crash every 160,000 miles or so. It's only a matter of time, advocates say, before robots become better drivers than us.

That is, if the lawyers let them. Industry insiders are already fretting about a host of legal problems that could bedevil robot car makers once a sufficient number of their creations take to the roads. Product liability, tort law, negligence, foreseeable harm, patent encumbrance, and design defects are only some of the concerns.

"The longer it takes for this technology to reach the market, the more people die," Josh Blackman, a law professor at the South Texas College of Law, said yesterday at a conference hosted by Stanford University called "We Robot: Getting Down to Business."


Over 100 of the brightest thinkers about self-driving cars and other robotic machinery gathered here this week to debate the legal snarls that could ensnare these machines. Papers presented included titles such as "Human Factors in Robotic Torts," "Risk Management in Commercializing Robots," and "Application of Traditional Tort Theory to Embodied Machine Intelligence."

"The longer it takes for this technology to reach the market, the more people die," law professor Josh Blackman, left, said at the "We Robot" conference yesterday.
"The longer it takes for this technology to reach the market, the more people die," law professor Josh Blackman, left, said at the "We Robot" conference yesterday. Bryant Smith, a resident fellow at the Stanford Center for Internet and Society, is on the right. Declan McCullagh/CNET

Cars are "the second-most dangerous consumer product that's available to be sold," after cigarettes, said Brad Templeton, a consultant who has worked with Google in the past on its self-driving car project. "We're making it a safer product. Robots won't make the same mistake twice."

One option is to enact laws limiting robot manufacturers' liability, an approach that saved the U.S. general aviation industry from collapse after President Clinton signed a 1994 measure curbing companies' legal exposure. But the idea will surely be opposed by the plaintiffs' bar, which happens to be one of the largest donors to the Democratic Party, and it isn't uniformly supported -- at least not yet -- even among robot enthusiasts.

"Is it appropriate to have a federal legislative response to the liability question?" said Bryant Smith, a resident fellow at the Stanford Center for Internet and Society. "At this point I'm fairly agnostic."

Allowing self-driving cars to share the roads with self-driving teenagers will likely result, in some cases, in human-initiated attempts to see just how quickly a robot can apply the brakes or how it will respond to attempts to run it off the road. Or programming errors could cause an accident that results in human injury or death.

Once that happens, headlines about self-driving cars causing accidents will "set back the movement significantly," Blackman says. That could result in a legal crackdown on a useful technology that will save far more lives than it takes.

It's a legal conundrum not limited to self-driving cars. Once robots progress beyond the Roomba's level of utility to more powerful machines that can handle a wider range of domestic chores, they may be able to harm or kill a human. When that happens, lawsuits over home robots are equally inevitable.

One way manufacturers can limit their liability is to lock down their platform -- the "closed robot" idea -- so it can't be modified. Open robots, on the other hand, may be far more useful and fun, and spur the kind of rapid prototyping that has led to innovation in other areas of computer hardware and software.

Ryan Calo, a professor at the University of Washington law school, has proposed extending selective immunity to robot makers in much the same way that Congress has provided selective immunity to firearm manufacturers and (through the Digital Millennium Copyright Act and the Telecommunications Act of 1996) Internet service providers. Calo suggests that the immunity would apply only when "it is clear that the robot was under the control of the consumer, a third party software, or otherwise the result of end-user modification."

Diana Cooper, a law student at the University of Ottawa, said yesterday that an "ethical licensing model" borrowing concepts from the Free Software Foundation's GPL is another way to protect open robotics. Her proposal (PDF) would require robot owners to buy insurance and would prohibit certain applications, so that if an end user violates the restrictions, the manufacturer would not be liable.

The license also aims to deter sexbots. "The prohibition of the sexual enslavement of robots," Cooper says in a paper (PDF) presented yesterday, "prevents the dehumanization of intimacy in male-female interactions."

All this is a bit speculative, argues Curtis Karnow, a superior court judge in San Francisco. He suggests that we don't have to worry too much about true machine intelligence yet: "Most of these products do what they are told to do, in the way they are told to do it... Unintended injuries are often just the result of human error and poor workplace design."