With an underwhelming 2018 in the books for Apple's self-driving car program, a new safety report submitted to NHTSA suggests the company is serious about self-driving car safety.
After a less than stellar showing in California's mandatory annual disengagement report, Apple on Wednesday submitted a white paper (PDF) to the National Highway Traffic Safety Administration outlining the testing procedures for its self-driving car program.
While most of the document reads like boilerplate for anyone testing a self-driving car, the part we find most interesting is the section outlining Apple's requirements for its test vehicles' human safety drivers.
To qualify as a safety driver for Apple's self-driving car program, you need a driving record free of serious accidents, DUI convictions, and license suspensions or revocations for at least the past 10 years. Prospective drivers who meet these requirements must then pass a drug screen and a background check.
Once hired, Apple safety drivers can expect a barrage of training and evaluation, beginning with a defensive driving course, followed by vehicle-specific training and testing designed to sharpen their reflexes.
Safety drivers are required to take regular breaks while driving and are paired with an operator who sits in the passenger seat during testing. Drivers are also given regular anonymous surveys to gauge the mental and physical toll the work takes on them.
Roadshow contacted both Uber and Waymo to ask what their hiring requirements for safety drivers are (or were, in Uber's case), as well as what ongoing training and evaluation programs they have in place for those drivers. Neither immediately responded to requests for comment.
While Apple's 2018 disengagement rate (871.65 disengagements per 1,000 miles traveled) may initially seem shocking, it's important to remember that Apple is new to the field relative to the likes of Waymo or Cruise.
Further, disengagements in the comparatively early stages of testing aren't the end of the world, as long as a qualified human safety driver is present and alert enough to correct the car's mistakes.
Sherif Marakby, CEO of Ford Autonomous Vehicles, echoed this sentiment last year when our Editor-in-Chief Tim Stevens went to Miami to try Ford and Argo AI's self-driving car: "We see interventions in a very different light. We see it as a good thing... We record that data and put it into simulation so we're not risking anything on the road. We can do it in the computer."
Apple didn't immediately respond to requests for comment.