I'm stopped in traffic, waiting for a left-turn light, when another car sticks its front corner into the space between my car and the one ahead. The car I'm in politely holds back to let the other car in, then follows it through the left turn.
Remarkably, a computer is driving: this is one of Delphi's self-driving research vehicles, out on a demonstration ride during CES in Las Vegas.
The Delphi car, an Audi SQ5 SUV, carries a load of sensors comprising LiDAR, radar and cameras. I've ridden in an earlier version of this car, which did an admirable job of driving suburban streets. Now, however, Delphi has added technology from Mobileye, a computer vision company, including a trifocal camera behind the windshield that identifies objects ahead, such as other cars and pedestrians.
Glen DeVos, Delphi vice president of engineering, tells me that while cameras accounted for only about 10 percent of the sensor load on the previous version of the car, in this build they contribute a share of the sensor input equal to that of the LiDAR and radar sensors.
A red-hot area of research in the automotive industry, self-driving cars show potential to greatly reduce or even eliminate the tens of thousands of deaths that occur on US roads every year. The technology may also ease traffic jams, a major waster of fuel and time in US cities. Along with automakers, equipment suppliers such as Delphi, startups and big tech companies like Google are all developing self-driving car technology.
While riding through Las Vegas, DeVos shows me different representations of the car's sensor input on its dashboard screen. The camera view picks out all the objects it recognizes in green boxes, letting me see whether it has identified cars and pedestrians. A sensor fusion view, generated by the car's computer from its LiDAR, radar and cameras, overlays colors on the different objects in the environment that it recognizes, with cyan for people and grey for other vehicles.
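As a toy illustration of the kind of fusion behind that display (an assumption of mine, not Delphi's actual pipeline), imagine camera detections carrying a class label and LiDAR returns carrying a position: each classified object is paired with the nearest LiDAR point, then colored by class, cyan for people and gray for vehicles. The function name `fuse` and the coordinates are made up for the example.

```python
import math

# Display colors from the demo: cyan for people, gray for other vehicles.
COLORS = {"pedestrian": "cyan", "vehicle": "gray"}

def fuse(camera_objs, lidar_points, max_dist=2.0):
    """Pair each classified camera object with its nearest LiDAR point."""
    fused = []
    for label, cam_xy in camera_objs:
        # Nearest LiDAR return to the camera's estimate of the object.
        nearest = min(lidar_points, key=lambda p: math.dist(cam_xy, p))
        if math.dist(cam_xy, nearest) <= max_dist:
            fused.append({"class": label,
                          "position": nearest,
                          "color": COLORS.get(label, "white")})
    return fused

# A pedestrian near the car and a vehicle farther down the road.
objects = fuse([("pedestrian", (4.0, 1.0)), ("vehicle", (20.0, -2.0))],
               [(4.2, 1.1), (19.8, -2.1)])
for obj in objects:
    print(obj["class"], obj["position"], obj["color"])
```

Real systems fuse tracks over time with uncertainty estimates; this nearest-neighbor matching is only the simplest possible sketch of the idea.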
While I watch how the car distinguishes lane lines, DeVos points out a thick green line overlaid down the center of the lane. This, he says, represents Mobileye's Road Experience Management (REM) technology. Essentially, REM is a crowdsourced path down the lane, generated by every other REM-equipped car that has driven down the road.
Using REM, a self-driving car does not rely merely on its own sensors; it can also take into account the best aggregated path that previous cars have driven. With enough such cars on the road, when one car adjusts its path to avoid an object in a lane, that information is shared with every other car. The other cars then know to adjust their paths accordingly in advance, while still corroborating the path with their own sensors.
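The aggregation idea can be sketched in a few lines. This is a toy model, not Mobileye's actual REM algorithm: assume each car reports the lateral offset it drove at a series of shared waypoints along the road, and the crowdsourced path is simply the average at each waypoint. The function name `aggregate_path` and the sample numbers are invented for illustration.

```python
def aggregate_path(driven_paths):
    """Average several cars' lateral offsets, waypoint by waypoint."""
    if not driven_paths:
        raise ValueError("need at least one driven path")
    n = len(driven_paths)
    # zip(*...) groups the offsets recorded at each shared waypoint.
    return [sum(offsets) / n for offsets in zip(*driven_paths)]

# Three cars drive the same stretch; the third shifts left mid-block,
# say to clear an object in the lane.
car_a = [0.0, 0.0, 0.0, 0.0]
car_b = [0.1, 0.0, -0.1, 0.0]
car_c = [0.0, 0.5, 0.5, 0.0]

reference = aggregate_path([car_a, car_b, car_c])
print(reference)  # the shared path bends where cars avoided the object
```

A real system would also weight recent traces more heavily and discard outliers; averaging is just the simplest way to show how one car's avoidance maneuver nudges the path every other car receives.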
During my ride, I'm impressed by how the car handles the rude driver cutting in, an unexpected circumstance, along with a tunnel that blocks the GPS signal and a freeway merge where the car has to accelerate up to 65 mph while judging its distance from other traffic.
DeVos says that Delphi's self-driving platform still needs years of development but could be production-ready by 2019. If that time frame holds, the technology could make its way into production cars by 2020 or 2021.