
Uber and its distracted driver at fault for fatal self-driving crash

The National Transportation Safety Board issues its final findings on what went wrong in the 2018 accident that killed a pedestrian in Arizona.

Dara Kerr

National Transportation Safety Board investigators examine Uber's self-driving vehicle involved in a fatal accident in Tempe, Arizona, in March 2018.

National Transportation Safety Board

Government officials said the main cause of a fatal crash involving one of Uber's self-driving cars in Tempe, Arizona, was the vehicle operator, who failed to monitor the road because she was distracted by her cell phone. Contributing to the incident was Uber's "inadequate safety culture," said the National Transportation Safety Board.

"Safety starts at the top," Robert Sumwalt, chair of the NTSB board, said Tuesday. "The collision was the last link of a long chain of actions and decisions made by an organization that unfortunately did not make safety the top priority."

The ruling, announced Tuesday, was a year and a half in the making. Investigators had been working since March 2018 to figure out why Uber's autonomous vehicle failed to detect a woman crossing the street outside a crosswalk.

What happened that night was captured in a video from the car's dashboard camera. The footage shows the pedestrian, Elaine Herzberg, walking her red bike loaded with bags across a dark road. The video stops at the moment of impact. It also shows the vehicle operator, Rafaela Vasquez, sitting at the wheel and repeatedly glancing down at her lap.

The NTSB investigators said Tuesday that Vasquez was streaming a video on her phone as she sat behind the wheel of the self-driving car. She looked up six seconds before the crash but then looked back down. The next time she looked up was one second before impact, just as the car collided with Herzberg.

Uber pulled all its autonomous vehicles from public roads in 2018 and shuttered its Arizona operations after the fatal crash, the first known pedestrian death caused by a vehicle in full autonomous mode. The crash put Uber's self-driving program under the scrutiny of local police, lawmakers and federal investigators, and it called into question the safety of self-driving cars overall.

"Machine perception is extremely challenging," an investigator said during a NTSB board meeting Tuesday addressing Uber's self-driving car crash. "The challenge of full vehicle automation has not yet been solved due to their current limitations."

The NTSB said Tuesday that Uber could've done more to prevent the fatality. Earlier this month, the NTSB released more than 400 pages of documents that said Uber didn't have a formal safety plan in place at the time of the crash. On Tuesday, it said Uber's lack of oversight of vehicle operators and inadequate safety procedures also contributed to the crash.

The NTSB also released 19 findings on the accident on Tuesday. Among those, the agency said Uber compromised public safety by disabling the emergency braking systems in its autonomous vehicles and by failing to equip them with technology that recognized pedestrians outside crosswalks.

"We deeply regret the March 2018 crash that resulted in the loss of Elaine Herzberg's life, and we remain committed to improving the safety of our self-driving program," Nat Beuse, head of safety for Uber's autonomous vehicle program, said in an email on Tuesday. "Over the last 20 months, we have provided the NTSB with complete access to information about our technology and the developments we have made since the crash."

Uber restarted on-road testing of its self-driving cars last December. Since the Arizona crash, it has pledged to put two safety drivers in all vehicles, placed four-hour driving limits on drivers and developed a safety management system, among other measures. The company also published a voluntary safety self-assessment in 2018.

"While we are proud of our progress, we will never lose sight of what brought us here or our responsibility to continue raising the bar on safety," Beuse said.

Sumwalt said Tuesday that the entire industry needs to adopt stricter safety measures if self-driving car programs are to win the public's confidence.

"Ultimately, it will be the public that accepts or rejects automated driving systems, and the testing of such systems on public roads," Sumwalt said. "Any company's crash affects the public's confidence. Anybody's crash is everybody's crash."

Originally published Nov. 19, 1:38 p.m. PT.
Update, 4:04 p.m.: Adds comment from Uber's Nat Beuse and additional background information.