Mind-controlled telepresence robots could restore mobility to the disabled

A brain-computer interface used in conjunction with telepresence robots could one day help the disabled find a measure of independence.

A user who is disabled controls the robot from his hospital bed over 200 kilometres away. Alain Herzog/EPFL

It may not be able to do grocery shopping or hang out laundry to dry, but a project involving current telepresence technology could help people with limited mobility get around in the form of a robotic avatar.

A team of researchers at the Swiss Federal Institute of Technology's Defitech Foundation Chair in Brain-Machine Interface in Lausanne, Switzerland, is working on a brain-computer interface that could see disabled people using their thoughts to control telepresence robots from the comfort of their homes.

"We have been developing brain-computer interfaces for people who suffer different kinds of motor disabilities so that they can translate their mental intentions into commands for the robots," explained project leader Professor José del R. Millán.

The project, which has been underway for a year, has tested the interface with nine people who are disabled and 10 healthy people across Italy, Germany and Switzerland. First, the user has to train to communicate with the robot: each command is tied to a specific mental task, so to turn the robot left or right, the user thinks in a particular way, activating a distinct area of the brain. The resulting electrical brain signals are picked up by a non-invasive cap fitted with electrodes.
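The article doesn't describe how the team's system decodes these signals, but a common approach in motor-imagery brain-computer interfaces is to compare mu-band (8-12 Hz) power over the two motor-cortex electrodes: imagining a hand movement suppresses mu power over the opposite hemisphere. A hypothetical sketch of that idea, with illustrative channel names (C3, C4) and threshold that are assumptions rather than details from the EPFL project:

```python
def decode_command(mu_power_c3, mu_power_c4, threshold=0.2):
    """Decode a steering command from mu-band (8-12 Hz) power over the
    left (C3) and right (C4) motor-cortex electrodes.

    Imagined right-hand movement suppresses mu power over C3 (left
    hemisphere); imagined left-hand movement suppresses it over C4.
    Channel names and threshold are illustrative assumptions.
    """
    total = mu_power_c3 + mu_power_c4
    if total == 0:
        return "forward"
    # Lateralization index: positive when C4 power drops relative to C3.
    lateralization = (mu_power_c3 - mu_power_c4) / total
    if lateralization > threshold:
        return "left"    # mu suppressed over C4 -> imagined left hand
    if lateralization < -threshold:
        return "right"   # mu suppressed over C3 -> imagined right hand
    return "forward"     # no clear lateralization: keep going straight
```

In a real system these band powers would come from spectral analysis of the electrode signals, and a trained classifier would replace the fixed threshold; the sketch only illustrates why per-user training matters, since the lateralization pattern differs from person to person.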

After training with the cap, users controlled, in real time, a telepresence robot located in a laboratory in Switzerland while remaining in their homes, sometimes in a different country.

The robot, still in its early stages, consists of a laptop on a wheeled frame. This has a camera that allows the user to see the environment around the robot, as well as a display that shows the user's face via Skype, letting the user have conversations with people at the robot's location.

Additionally, the robot is fitted with sensors that detect the proximity of objects in the room -- allowing it to avoid collisions on its own, without being micromanaged by the user.
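This division of labour, where the user supplies high-level intent and the robot handles low-level safety, is often called shared control. The article gives no implementation details, so the following is a purely hypothetical sketch: the sensor layout (one reading per direction) and the safety distance are assumptions for illustration.

```python
SAFE_DISTANCE_M = 0.5  # assumed minimum clearance before overriding, in metres

def shared_control(user_command, distances):
    """Blend the user's decoded intent with autonomous collision avoidance.

    user_command: "forward", "left" or "right" from the brain decoder.
    distances: dict mapping each direction to the nearest obstacle in
    metres, e.g. {"forward": 2.0, "left": 0.3, "right": 1.5}.
    """
    # Path clear in the requested direction: obey the user.
    if distances.get(user_command, float("inf")) > SAFE_DISTANCE_M:
        return user_command
    # Requested direction is blocked: steer toward the clearest
    # alternative, or stop if everything is too close.
    clear = {d: r for d, r in distances.items() if r > SAFE_DISTANCE_M}
    if not clear:
        return "stop"
    return max(clear, key=clear.get)
```

The design choice here mirrors what the article describes: the user never has to micromanage obstacle avoidance, which keeps the mental workload low enough for an EEG-based interface with its limited command rate.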

So far, the laboratory is reporting a 100 percent success rate.

Users were able to direct the robot from room to room from the very first trial, said Professor Millán. "But then we went over to compare their performance against 10 people without any kind of motor disabilities, and we saw that their performance was essentially the same."

Each of the users who are disabled was able to control the telepresence robot easily with less than 10 days of training.

The work, part of the European Commission-funded Tools for Brain-Computer Interface project, is still in the testing phase and has yet to be made available to users. The team hopes that the technology will eventually reach the public, but some issues need to be overcome first.

"We would like to see this technology at the user's site, not confined to the laboratory," Professor Millán said. "For this to happen, insurance companies will have to help finance these technologies."