
iRobot uses Ava test-bed to prototype human/bot interactions

Rafe gets a CES demo from iRobot Vice President Mark Chiappetta.

Rafe Needleman, Former Editor at Large
Watch this: iRobot's Ava test-bed takes a stroll at CES

LAS VEGAS--iRobot makes maidbots (Roombas and the like) and warbots (the Packbot and similar), but not much in the middle. So the company is using its Ava test-bed to experiment with making robots that interact with people directly.

At CES, I talked with iRobot Vice President Mark Chiappetta to learn more about this initiative and the Ava project. What looks like a mobile stand for an iPad incorporates a lot of interesting technology. Starting at ground level, holonomic "omniwheels" let the robot move in any direction and in any orientation. In other words, it can side-step obstacles, just like you and me.
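iRobot hasn't published the math behind Ava's drive, but the standard kinematics for a three-wheel holonomic base are simple to sketch: each omniwheel's speed is a projection of the desired body velocity, plus a term for rotation. The wheel angles and base radius below are illustrative assumptions, not Ava's actual geometry.

```python
import math

# Illustrative three-omniwheel holonomic base (NOT Ava's real geometry).
# Wheels are mounted tangentially at these angles around the base center.
WHEEL_ANGLES = [math.radians(a) for a in (90, 210, 330)]
BASE_RADIUS = 0.20  # meters from center to each wheel (assumed)

def body_to_wheel_speeds(vx, vy, omega):
    """Map a desired body velocity (vx, vy in m/s, omega in rad/s)
    to individual wheel surface speeds (m/s).

    Because omniwheels roll freely sideways, any mix of translation
    and rotation is achievable -- the robot can side-step an obstacle
    without turning to face it first.
    """
    return [
        -math.sin(a) * vx + math.cos(a) * vy + BASE_RADIUS * omega
        for a in WHEEL_ANGLES
    ]

# Pure sideways slide, no rotation:
print(body_to_wheel_speeds(vx=0.3, vy=0.0, omega=0.0))
```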

There are ultrasonic sensors around the perimeter of the base, as well as a laser rangefinder in front. There's also an upward-pointing sensor that keeps the robot from getting "clotheslined" by a tabletop or other obstacle its ground-level sensors won't see.
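How Ava actually fuses those layers isn't public, but the safety behavior such a sensor stack enables can be sketched simply: slow down, then stop, whenever anything at floor level or overhead gets inside a safety margin. The function, inputs, and thresholds below are illustrative assumptions, not iRobot's values.

```python
# Illustrative obstacle check combining the sensor layers described above.
# Distances are in meters; both thresholds are assumed.

STOP_DISTANCE = 0.30   # halt if anything is closer than this
SLOW_DISTANCE = 0.80   # reduce speed inside this range

def speed_scale(sonar_ranges, laser_ranges, overhead_range):
    """Return a speed multiplier in [0, 1] given the closest readings
    from the perimeter ultrasonic ring, the forward laser rangefinder,
    and the upward-pointing sensor watching for table edges and other
    overhangs the ground-level sensors would miss."""
    closest = min(min(sonar_ranges), min(laser_ranges), overhead_range)
    if closest <= STOP_DISTANCE:
        return 0.0                       # too close on any layer: stop
    if closest >= SLOW_DISTANCE:
        return 1.0                       # clear: full speed
    # Linearly taper speed between the two thresholds.
    return (closest - STOP_DISTANCE) / (SLOW_DISTANCE - STOP_DISTANCE)

print(speed_scale(sonar_ranges=[1.2, 0.9, 2.0], laser_ranges=[0.6], overhead_range=1.5))
```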

The tablet at the top of the bot is mounted on a post that can rise to interact with a standing person or lower to meet one who's sitting. It has its own suite of sensors, including a Kinect-like system for tracking gestures. You can also touch the tablet pedestal to raise, lower, or rotate it.
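iRobot hasn't detailed how the post tracks a person, but the simplest version is a clamp-and-nudge loop: move toward the detected head height, staying within the mechanism's travel. The travel limits, gain, and function names below are assumptions for illustration.

```python
# Illustrative pedestal-height controller (assumed travel limits and gain).

MIN_HEIGHT = 0.9   # meters, post fully lowered (assumed)
MAX_HEIGHT = 1.7   # meters, post fully raised (assumed)
GAIN = 0.5         # fraction of the remaining error to close each step

def next_post_height(current, detected_head_height):
    """Nudge the tablet post toward eye level of the person the
    depth camera is tracking, clamped to the mechanical limits."""
    target = max(MIN_HEIGHT, min(MAX_HEIGHT, detected_head_height))
    return current + GAIN * (target - current)

# Person stands up: head detected at 1.65 m, post currently at 1.0 m.
print(next_post_height(current=1.0, detected_head_height=1.65))
```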

The robot is semi-autonomous. I was shown its iPad-based navigation app. As the robot moves around, it maps its environment. If you want to send it somewhere, you just touch the destination on the generated map, and the bot finds its way there, avoiding obstacles, including people. I stood in its way for one drive, and it navigated correctly around me (slowly, but it did it).
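iRobot hasn't described Ava's planner, but tap-to-drive navigation over a self-built map typically comes down to path search on an occupancy grid, replanned whenever something new (such as a person) blocks the way. Here's a minimal A* sketch; the grid representation, function, and example map are illustrative, not Ava's software.

```python
import heapq

def plan_path(grid, start, goal):
    """A* over a 2D occupancy grid (True = blocked), 4-connected.
    Returns a list of (row, col) cells from start to goal, or None.
    If a person steps into a cell, mark it blocked and replan."""
    def h(cell):  # Manhattan-distance heuristic
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    open_set = [(h(start), 0, start, None)]
    came_from, best_cost = {}, {start: 0}
    while open_set:
        _, cost, cell, parent = heapq.heappop(open_set)
        if cell in came_from:
            continue
        came_from[cell] = parent
        if cell == goal:                      # rebuild the path backwards
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]) and not grid[nr][nc]:
                new_cost = cost + 1
                if new_cost < best_cost.get((nr, nc), float("inf")):
                    best_cost[(nr, nc)] = new_cost
                    heapq.heappush(open_set, (new_cost + h((nr, nc)), new_cost, (nr, nc), cell))
    return None  # no route around the obstacles

# Tiny map: a wall with a gap, like a person standing mid-corridor.
grid = [[False, True, False],
        [False, True, False],
        [False, False, False]]
print(plan_path(grid, start=(0, 0), goal=(0, 2)))
```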

Chiappetta told me that Ava is what iRobot uses to test "high-fidelity interaction with humans." The company has no product like this yet: a robot that actually interacts with people. I'm told iRobot is working with InTouch Health to explore using robots for telepresence in hospitals. Chiappetta says he ultimately wants to see Ava-derived bots "helping the elderly to age in place."

Ava has its limits, though. It can't climb stairs, and it has no manipulator appendages ("arms" to you and me). The company does have manipulators on some of its military bots, though, and since all of iRobot's products are based on the same operating system, adding some real-world interaction capabilities would not be impossible. At the very least, Ava needs some way to open doors.