Stanford University earlier this year consolidated its automotive research workshops into a single building on the edge of campus. Funded in part by Volkswagen (hence the name: the Volkswagen Automotive Innovation Lab, or VAIL), the building houses researchers working on projects with a variety of car and equipment manufacturers (Nissan, Bosch, Honda, etc.), as well as other partners such as State Farm insurance.
An enclosed bay holds the shell of a car with a projection system in front of it, for testing human reactions to driving situations. Stanford professor Clifford Nass has been using this rig to begin to answer the question, "How is a driver going to communicate with an autonomous car?" One thing Nass has discovered is that a car's control-system personality has to mesh with the driver's. A "happy" car--one that greets the driver with a cheerful "let's go!"--paired with a grumpy driver will be dismissed and not taken seriously.
According to Beiker, Nass found that when the "mood" of the car matches the driver's, the driver is more likely to pay attention to the car and thus drive more safely. How to match the car to the driver's state of mind is still an open research question.
This is the Apogee solar car, which competed in the race across Australia in 2009. In this photo the car's solar component--the top shell, which houses the solar panels--is missing. In front you can see the yellow battery pack, which project manager Nathan Hall-Snyder (pictured) told me is good for about 50 miles. Hall-Snyder says the research behind the Apogee battery pack also informed the design of the first Tesla sports cars.
On battery power alone, I drove the convertible Apogee around the parking lot a few times. With the solar cells in place and the sun overhead, the car can cruise at 50 mph all day.
The Apogee was cool to drive, but not fun. The chattering drive electronics are noisy, the steering and brakes feel wooden, and my leg quickly tired from being jammed into the one spot where it could operate the pedals. I suspect the car's official driver is shorter.
Professor Chris Gerdes gave me a walk-around of the P1, his team's first modular testbed. It was designed to prototype "steer by wire" technology, and in its current incarnation has no link between the steering wheel and the front wheels, nor are the front wheels mechanically linked to each other, as they are in all cars on the road today. Independent control of the wheels yields some odd bonuses: the car can toe its front wheels inward, snowplowing like a beginning skier, to aid braking.
Gerdes says the steering motors also provide rich feedback on the grip available to the tires, enabling more accurate and subtle "envelope control" than today's "stability control" computers, which kick in only once a car starts to lose control.
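The distinction can be caricatured in a few lines. This is my own sketch, not Stanford's controller: it assumes a simple friction-circle limit and a bicycle model, and the point is that an envelope controller clamps a command before the tires saturate, rather than reacting after a slide begins.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def max_steer_rad(speed_mps, mu, wheelbase_m=2.7):
    """Envelope-control idea: cap the steering angle so the commanded
    lateral acceleration v^2/R stays within the tires' friction limit
    mu*g, using a bicycle-model turn radius R = wheelbase/steer."""
    if speed_mps < 0.1:
        return math.radians(30)  # parking speeds: no meaningful limit
    min_radius = speed_mps ** 2 / (mu * G)  # tightest turn grip allows
    return min(math.radians(30), wheelbase_m / min_radius)

# At highway speed, a slick surface shrinks the allowed steering angle:
print(max_steer_rad(30.0, 0.3) < max_steer_rad(30.0, 0.9))  # True
```

A stability-control computer, by contrast, would let the full 30-degree command through and intervene only after measuring that the car had begun to slide.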
The successor to the P1 is the X1, a more car-like (read: comfortable) modular testbed designed for testing four-wheel steering. Like the P1, the X1 uses multiple GPS receivers (mounted on the roll bar) to provide data on position, direction, and attitude. Gerdes says the differential data is more accurate than inertial sensors at telling the control systems whether the car is sliding (moving sideways), and can even provide data about the state of tire inflation.
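The slide-detection idea can be sketched roughly: two GPS antennas give the car's heading (which way the body points), while GPS velocity over ground gives the course (which way the car is actually moving); when the two diverge, the car is slipping sideways. A minimal illustration, with the function name and example numbers my own assumptions rather than Stanford's code:

```python
import math

def sideslip_deg(heading_deg, velocity_east, velocity_north):
    """Estimate sideslip: the difference between where the body points
    (heading, e.g. from two roll-bar GPS antennas) and where the car
    is actually moving (course, from GPS velocity over ground)."""
    course_deg = math.degrees(math.atan2(velocity_east, velocity_north))
    # Wrap the difference into (-180, 180] degrees.
    return (course_deg - heading_deg + 180.0) % 360.0 - 180.0

# A car pointed due north but drifting slightly east shows a small
# positive slip angle:
print(round(sideslip_deg(0.0, 1.0, 20.0), 1))
```

An inertial sensor has to integrate accelerations to infer the same thing, accumulating drift; the GPS difference is a direct measurement.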
Stanford's Pikes Peak Audi (developed with other industry partners) was not at VAIL, as it had only just made the run up the Pikes Peak road without a driver and was still cooling down in Colorado before being shipped back.
The Audi is a more photogenic autonomous car than the self-driving VWs from Stanford's previous entries into the DARPA Grand Challenges, and part of the reason is that the Audi has no computer vision technology. It ran up the Pikes Peak mountain road using a detailed map stored in its system, GPS to locate itself, and data from wheel-spin and other sensors, which let it drive at the absolute limits of control. But as Beiker told me, it would not be able to steer around a boulder if one was dropped on the road in front of it.
Stanford has built two autonomous Volkswagens to compete in DARPA's self-driving car challenges. They're not as fast as the Audi, but they use vision technology in addition to GPS and other data to see their environment.
Stanford is also researching technology to gauge the intent of pedestrians a car scans while driving. For example, if a car at an intersection sees a person standing on a curb facing toward the street, it will act on the assumption that the person could step into the car's path. If the person is facing the other way, it can discount that possibility--but not entirely.