Ever since the introduction of the smartphone, automakers have been playing catch-up, trying to build similarly useful electronics into the dashboards of new models. With even the newest cars hobbled by slower development and production timelines, however, drivers found the navigation apps on their phones more capable and up-to-date than the ones in their dashboards.
From what I saw at last week's CES 2015, it looks like we are at a tipping point, where automakers have not only caught up but are poised to surpass smartphone tech.
At the Nvidia press conference the Sunday before CES, CEO Jen-Hsun Huang introduced the company's latest mobile processor, the Tegra X1. Pointing out that the X1 can power multiple high-definition displays, he said it was overkill for use in smartphones. The rest of the press conference showed the chip's application in cars, powering instrument cluster and infotainment displays, as well as autonomous driving systems.
Huang left me wondering how much processing power my smartphone really needs. The display is never going to get bigger than that needed for a handheld device, and it already accomplishes onboard computing tasks faster than it can get data over its 4G connection.
Qualcomm took a similar tack. It showed a concept car using its automotive-grade Snapdragon 602A chip to run both an LCD-based instrument cluster and a large center LCD for infotainment. Another 602A processed video feeds from external cameras for side and rear views in the car, along with other driver assistance functions.
Whereas cars of the future will benefit greatly from multiple sensor feeds, including video, radar and laser scanners, phones will likely never go beyond a camera or two.
Automotive tier-one suppliers are also in the game, enabling higher-powered chip use in cars. Harman International showed me its Oakland platform, a dashboard system designed for premium cars that supports multiple displays running off the same chip. Harman remains agnostic about which brand of chip an automaker might want, or even what software runs on the system, although it supplies minimum specifications for what an automaker wants to achieve. With Oakland, Harman demonstrated not only multiple displays, but also an app store, a head-up display, driver assistance systems, a gesture control interface, and even support for both Apple CarPlay and Android Auto.
Those latter two systems have been greatly anticipated by drivers who like their smartphone features, but they may already be coming late to the game. Talking with Ricky Hudi, Audi's chief executive engineer for electrics/electronics, I asked whether he felt we were at a point where the car is getting smarter than the phone. He replied emphatically that we are already there. Looking at Audi's latest models, I'm not surprised by his opinion. The newest models employ the Nvidia Tegra 30 chip and 4G/LTE data connections, powering what Audi calls its Virtual Cockpit: a big LCD in place of the instrument cluster showing virtual gauges and navigation maps. The Q7 includes an additional infotainment display on the dashboard and integrates two tablets that let passengers enjoy personal entertainment, send music to the car's stereo and enter destinations for navigation. The new Q7 will support Apple CarPlay and Android Auto, but I frankly wonder whether those features will get much use from owners, considering the richness of Audi's connected features, including Google Earth integration with the car's navigation system.
Given their limited form factors, smartphones may have already hit a capability plateau, yet cars have a bright future of continued development. Consider the multiple displays hitting cabins, the adoption of 4G/LTE data to run connected features, and rapidly advancing self-driving features that require a processor to build a real-time virtual environment from sensor data. The concept car unveiled at CES was a perfect expression of where automotive technology can go. I can't envision a similar future concept for a smartphone.