Denso camera tech detects distracted walking
At Nvidia's developer conference last week, auto supplier Denso showed image-processing tech that can tell if you are staring at your phone.
In a presentation at Nvidia's GTC developer conference last week, two engineers from Japanese automotive equipment supplier Denso showed their work on image processing that lets a computer recognize and predict pedestrian behavior, technology with significant implications for automotive safety and future autonomous cars.
Denso engineers Ikuro Sato and Hideki Nihara gave the presentation, titled "Beyond Pedestrian Detection: Deep Neural Networks Level-Up Automotive Safety." They outlined how image-processing software could examine the frames of a camera feed and extract what they called "hidden data." The software first breaks down each frame, as a still image, into multiple layers, defining edges that ultimately let a car's computer determine what sorts of pedestrians or objects are in its field of view.
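To give a feel for what that low-level edge extraction involves, here is a minimal sketch of gradient-based edge detection with standard Sobel kernels. This is an illustration of the general technique, not Denso's actual pipeline; the function name and the use of NumPy are assumptions for the example.

```python
import numpy as np

def sobel_edges(frame):
    """Compute edge strength in a grayscale frame via Sobel kernels.

    Illustrative sketch of low-level edge extraction, not Denso's
    actual software. `frame` is a 2-D array of pixel intensities.
    """
    kx = np.array([[-1, 0, 1],
                   [-2, 0, 2],
                   [-1, 0, 1]], dtype=float)  # horizontal gradient kernel
    ky = kx.T                                  # vertical gradient kernel
    h, w = frame.shape
    gx = np.zeros((h - 2, w - 2))
    gy = np.zeros((h - 2, w - 2))
    # Slide the 3x3 kernels over every interior pixel.
    for i in range(h - 2):
        for j in range(w - 2):
            patch = frame[i:i + 3, j:j + 3]
            gx[i, j] = np.sum(patch * kx)
            gy[i, j] = np.sum(patch * ky)
    return np.hypot(gx, gy)  # gradient magnitude per interior pixel
```

Feeding in a frame with a sharp vertical boundary produces strong responses along that boundary and zeros in the flat regions, which is the kind of structure a later classification stage can use to separate pedestrians from background.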
The goal of the research is to build a system that can not only tell a pedestrian's direction and speed, but also identify which pedestrians might be elderly or staring at a smartphone screen.
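The direction-and-speed part of that goal can be sketched very simply: once detections are tracked across frames, two successive positions and the frame interval give a velocity estimate. The helper below is hypothetical, assuming detections have already been mapped to ground-plane coordinates in metres.

```python
import math

def motion_from_centroids(prev, curr, dt):
    """Estimate a pedestrian's speed and heading from two detections.

    `prev` and `curr` are (x, y) ground-plane positions in metres from
    consecutive frames; `dt` is the time between them in seconds.
    Hypothetical helper, not part of Denso's system.
    """
    dx = curr[0] - prev[0]
    dy = curr[1] - prev[1]
    speed = math.hypot(dx, dy) / dt             # metres per second
    heading = math.degrees(math.atan2(dy, dx))  # 0 degrees = +x axis
    return speed, heading
```

Attributes such as "elderly" or "staring at a phone" are a harder problem, which is where the deep-neural-network classification described in the talk comes in.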
As an on-site demonstration, Nihara walked in front of a live camera while the feed was processed to overlay his height and the direction he was facing. The development software ran on Nvidia's new Tegra K1 processor, announced at CES earlier this year.
With this technology, automakers could install systems that alert the driver when a pedestrian is likely to enter the car's path. Autonomous cars could become better at understanding the variety of threats in their immediate environment and at negotiating crowded urban streets.
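The alert logic described above can be sketched as a simple path-prediction check: extrapolate the pedestrian's motion forward and see whether it crosses the car's lane within some look-ahead window. The geometry and parameters below are assumptions for illustration, not Denso's implementation.

```python
def will_enter_path(pos, vel, lane_half_width, horizon, step=0.1):
    """Predict whether a pedestrian will enter the car's lane corridor.

    Assumes the car travels along the x-axis and its lane spans
    |y| <= lane_half_width. `pos` and `vel` are (x, y) in metres and
    m/s; `horizon` is the look-ahead in seconds. Illustrative only.
    """
    t = 0.0
    while t <= horizon:
        # Constant-velocity extrapolation of the pedestrian's position.
        y = pos[1] + vel[1] * t
        if abs(y) <= lane_half_width:
            return True
        t += step
    return False
```

A pedestrian three metres to the side of the lane and walking toward it at 1 m/s would trigger the check within a three-second horizon, while one walking away would not.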