Robotic arm found to work too easily

A computer program directs a robotic arm to grab objects with just one touch, but participants in a pilot study deemed the automatic mode "too easy."

No, you are not reading The Onion. A computer program created at the University of Central Florida that directs a robotic arm to grab objects with just one touch was deemed by many participants in a pilot study to be "too easy" to use, a finding the designers had not anticipated.

Bob Melia (left) is a quadriplegic who advised the UCF team. Jason Greene/UCF

"We focused so much on getting the technology right...We didn't expect this," says developer Aman Behal, an assistant professor of engineering and computer science at UCF.

The computer program directs the robotic arm into action based on voice command, touch screen, computer mouse, or joystick. Sensors mounted on the arm detect an object and relay spatial data about it to the computer, which calculates how the arm should move to retrieve the object.
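The sense-localize-plan flow described above can be sketched roughly in a few lines of Python; the function names and the simple straight-line planner here are illustrative assumptions, not the UCF team's actual software:

```python
import math

# Hypothetical sketch of a one-touch grab pipeline: read a sensor frame,
# localize the object, then plan waypoints toward it. Names are invented.

def detect_object(sensor_frame):
    """Stand-in for the arm-mounted sensors: return the object's (x, y, z)."""
    return sensor_frame["object_xyz"]

def plan_reach(arm_xyz, object_xyz, step=0.05):
    """Compute straight-line waypoints (every `step` meters) from arm to object."""
    delta = [o - a for a, o in zip(arm_xyz, object_xyz)]
    dist = math.sqrt(sum(d * d for d in delta))
    n = max(1, int(dist / step))
    return [tuple(a + d * (i / n) for a, d in zip(arm_xyz, delta))
            for i in range(1, n + 1)]

frame = {"object_xyz": (0.30, 0.10, 0.20)}   # one simulated sensor reading
waypoints = plan_reach((0.0, 0.0, 0.0), detect_object(frame))
# The final waypoint coincides with the detected object position.
```

A real system would replace the straight-line planner with inverse kinematics and collision checks, but the division of labor (sense, localize, plan, move) is the same.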

But in the pilot study, most participants far preferred to pick up objects using the program's manual mode (i.e., typing in precise instructions or verbally directing the arm through a series of commands), even though they performed the assigned tasks less well this way.

"If we're too challenged, we get angry and frustrated," says John Bricout, associate dean for Research and Community Outreach at the University of Texas at Arlington School of Social Work who has conducted research on adapting technologies for users with disabilities and who collaborated with Behal on this study. "But if we aren't challenged enough, we get bored. We all experience that. People with disabilities are no different."

Behal is now seeking funding to develop a hybrid mode that uses laser, ultrasound, and infrared sensing to make the robotic arm more accurate and interactive. But the less popular automatic mode won't be thrown out; it may be the best option for those with less mobility, as quadriplegic Bob Melia, who advised the UCF team, points out:

You have no idea what it is like to want to do something as simple as scratching your nose and have to rely on someone else to do it for you. I see this device as someday giving people more freedom to do a lot more things, from getting their own bowl of cereal in the morning to scratching their nose anytime they want.

Behal presented his findings at the 2010 International Conference on Robotics and Automation in Anchorage, Alaska.
