
Georgia Tech uses human arm sensors to make robots safer

A control system helps robots quickly adjust to subtle changes in the way their operators move, which could improve safety in manufacturing settings.

Elizabeth Armstrong Moore
Professor Jun Ueda is working with recent Ph.D. graduate Billy Gallagher to train robots to follow a human's lead. Georgia Tech

Imagine you work in a manufacturing plant where your job is to hang a car door on a hinge with the help of a large robot. You're using a lever to guide the bot to the precise drop-off location, so you need to be close enough to see what you're doing. Chances are, you want this bot to be as intelligent as possible, unless, of course, you're willing to risk life and limb for some kind of disability pay.

Well, the good folks at Georgia Institute of Technology are working hard on the intelligence factor -- devising a control system to make robots smarter and ultimately safer.

The researchers are using arm sensors on mortal humans (mortality being a weakness that's tough to work around) to train the robots to mimic, respond to, and even predict the person's movements so that the two can work side-by-side more seamlessly.

"It turns into a constant tug of war between the person and the robot," Billy Gallagher, a recent Ph.D. graduate in robotics and project head, said in a school news release. "Both react to each other's forces when working together. The problem is that a person's muscle stiffness is never constant, and a robot doesn't always know how to correctly react."

Unfortunately for the human, when the operator wants to stop a movement and, say, hold the lever still, the arm muscles contract and stiffen to rein in the motion. That confuses the robot, which has no way of telling whether the contraction is a command to change direction or merely the reaction force produced by the stiffening muscles.

So it reacts regardless, said Jun Ueda, Gallagher's adviser. And when it reacts, it creates a vibration, causing the human to react by further stiffening his arms, which in turn creates more force. You can see where this is going. The vibrations and stiffness build, and the end result is instability, right when that human is supposed to guide that robot to gently drop the door into place.
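
To get a feel for that runaway loop, here's a minimal, purely illustrative sketch: the robot is reduced to a simple admittance controller, a virtual mass-damper driven by the force it feels at the handle, and the operator to a spring that tries to hold the handle still and stiffens whenever the handle shakes. None of the numbers or modeling choices come from the Georgia Tech work; they're just enough to show the shaking feeding on itself.

```python
# Toy 1-D model of the "tug of war" described above. Everything here is an
# illustrative assumption, not the Georgia Tech controller.

def tug_of_war(steps=1000, dt=0.01):
    m, b = 2.0, 8.0        # virtual mass (kg) and fixed virtual damping (N*s/m)
    k_arm = 800.0          # operator's arm stiffness (N/m)
    x, v = 0.05, 0.0       # handle starts 5 cm from where the arm wants it

    for _ in range(steps):
        f_arm = -k_arm * x             # the arm pulls the handle back toward 0
        a = (f_arm - b * v) / m        # the robot reacts to the force it measures
        v += a * dt                    # discrete control loop (explicit Euler)
        x += v * dt
        # A shaky handle makes the operator tense up, so stiffness creeps
        # upward (capped to keep the toy numbers readable).
        k_arm = min(k_arm + 50.0 * abs(a) * dt, 3000.0)
        if abs(x) > 0.5:               # oscillation has grown tenfold
            return "unstable: the handle oscillation keeps growing"
    return "stable: the handle settles at the hold point"

print(tug_of_war())   # with fixed damping and a stiffening arm, this prints "unstable"
```

Because the robot's response is fixed while the simulated arm keeps stiffening, the loop drifts past its stability margin and the oscillation builds, which is the kind of runaway Ueda describes.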

This is where the arm sensors come in. By reading the operator's muscle activity and sending those readouts to a computer, the control system determines the operator's actual state and relays that information to the robot, which can then interact more smoothly with the human.

Gallagher says it's a new approach to training a robot: "Instead of having the robot react to a human, we give it more information. Modeling the operator in this way allows the robot to actively adjust to changes in the way the operator moves."
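
To sketch what "more information" could buy, assume a hypothetical estimate_stiffness() that turns normalized EMG readings from an antagonistic muscle pair into a rough arm-stiffness guess, and let the robot schedule its virtual damping from that guess instead of waiting to feel the force. The function names, calibration constants, and the b > k*dt rule of thumb are assumptions tied to the toy loop above, not the researchers' actual model.

```python
# Sketch of scheduling the robot's behavior from sensed muscle activity.
# The mapping and constants below are hypothetical placeholders.

def estimate_stiffness(biceps, triceps, k_rest=400.0, k_gain=2600.0):
    """Map normalized EMG activations (0..1) of an antagonistic pair to a
    rough arm-stiffness estimate in N/m: co-contraction means a stiff arm."""
    co_contraction = min(biceps, triceps)
    return k_rest + k_gain * co_contraction

def schedule_damping(k_est, dt=0.01, margin=1.5):
    """Pick virtual damping so the discrete admittance loop in the toy model
    above stays comfortably stable (roughly b > k * dt)."""
    return margin * k_est * dt

for label, biceps, triceps in [("relaxed", 0.1, 0.1),
                               ("guiding", 0.4, 0.2),
                               ("tensed",  0.8, 0.7)]:
    k = estimate_stiffness(biceps, triceps)
    print(f"{label:8s} arm: k ~ {k:4.0f} N/m -> damping b = {schedule_damping(k):4.1f} N*s/m")
```

Dropping that scheduled damping into the earlier toy loop in place of the fixed value keeps the robot above its stability margin even as the operator tenses up, so the shaking dies out instead of building.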

The researchers are working under a $1.2 million National Robotics Initiative grant to choreograph this more elegant tango so that those who work alongside robots in military, automobile, and aerospace manufacturing encounter fewer problems. See the sensors at work in the video below: