Sensing their surroundings

Most of the technology used by the cars of the DARPA Grand Challenge to see obstacles and find their routes is off the shelf, although not stocked in your standard computer or hardware store. GPS units have been around for a while, but the kind used in the autonomous cars is an industrial unit accurate to 10 to 30 centimeters. The majority of competitors used units from Trimble. The GPS units were often coupled with an inertial measurement unit (IMU), which measures speed and acceleration on all axes. An IMU can keep tracking the vehicle's location if the GPS signal gets interrupted.
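The GPS-plus-IMU pairing works by dead reckoning: when the satellite fix drops out, the computer integrates the IMU's acceleration readings to carry the last known position forward. A minimal one-axis sketch, assuming a fixed sampling interval and made-up numbers (this is an illustration, not any team's actual navigation code):

```python
# Hypothetical sketch of IMU dead reckoning: integrate acceleration to
# update speed, then integrate speed to update position, starting from
# the last GPS-anchored state. All names and values are illustrative.

def dead_reckon(position, velocity, accel_samples, dt):
    """Advance a one-axis position estimate from accelerometer samples.

    position, velocity: last GPS-anchored state (m, m/s)
    accel_samples: accelerations recorded since the GPS dropout (m/s^2)
    dt: sampling interval (s)
    """
    for a in accel_samples:
        velocity += a * dt        # integrate acceleration -> speed
        position += velocity * dt  # integrate speed -> position
    return position, velocity

# One second of steady 0.5 m/s^2 braking sampled at 10 Hz,
# starting from 10 m/s:
pos, vel = dead_reckon(0.0, 10.0, [-0.5] * 10, 0.1)  # ~9.7 m, 9.5 m/s
```

Because each step adds a little sensor error to the next, the estimate drifts over time, which is why the IMU is a stopgap for GPS outages rather than a replacement.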

One of the most popular types of sensors used to see the immediate surroundings was lidar. Lidar is an acronym for light detection and ranging; these units project a laser beam and use its reflection to find solid objects. SICK makes many of the units used by the competitors. Lidar works only at short range, so many of the vehicles also used radar for long-range detection. Although radar isn't as good as lidar at determining the shape of an object, a long-range sensor gives the onboard computers more time to identify obstacles. A few competitors supplemented their sensors with stereo vision cameras, while the Princeton team relied on them exclusively. Where a laser beam sees only a narrow slice of the terrain, stereo vision cameras see everything in front of the vehicle. The stereo images let the computer determine how far away an object is, with different pixel colors indicating range. The one problem with stereo vision cameras is that they don't provide much information about what's not there, like a sudden drop-off.
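The ranging trick behind stereo cameras is simple geometry: a feature seen by both cameras shifts horizontally between the two images, and that shift (the disparity) plus the known camera spacing gives depth by similar triangles. A minimal sketch with made-up focal-length and baseline numbers, not any team's actual vision code:

```python
# Hypothetical sketch of stereo ranging: depth = focal * baseline / disparity.
# Nearby objects produce large disparities; distant ones, small disparities.
# The 800 px focal length and 0.5 m baseline below are illustration values.

def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth in meters from focal length (px), camera spacing (m), disparity (px)."""
    if disparity_px <= 0:
        raise ValueError("zero disparity would put the point at infinity")
    return focal_px * baseline_m / disparity_px

# A feature shifted 20 pixels between the left and right images:
depth = stereo_depth(800.0, 0.5, 20.0)  # 20.0 m away
```

This also hints at the drop-off problem the article mentions: the method only yields a range where both cameras see a matchable feature, so an absence of ground ahead produces no disparity to measure.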