Wheeled 'Cyclops' bot offers insight into blindness
Caltech researchers are working on a remote-controlled rover with an onboard camera that could deliver some very useful data about how the visually impaired see the world.
At first glance, Cyclops looks like a simple hobby rover, and it's hard to imagine what connection it could have to restoring sight. But dig a little deeper and it starts to make sense that a remote-controlled robot with an onboard camera could deliver some very useful data.
The digital camera can emulate left-to-right and up-and-down head movements. The idea is that as artificial vision prostheses increasingly become a reality, scientists could use the mobile robotic platform to mimic those devices--and more importantly, to get a better sense of how well they work for people who wear them.
The researchers might do that by asking a robot outfitted with an artificial vision aid to navigate obstacles in a corridor, or to follow a black line down a white-tiled hallway to see whether it can find--and enter--a darkened doorway. All the while, they could try out different pixel arrays (say, 50 pixels vs. 16 pixels), as well as image filters (for factors such as contrast, brightness enhancement, and grayscale equalization), to venture an educated guess as to which settings maximize a subject's sight.
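The kind of trial described above can be roughed out in software: downsample a camera frame to a coarse grid the size of an implant's pixel array, then apply a filter. A minimal sketch, assuming a grayscale frame held in a NumPy array; the function and parameter names here are hypothetical illustrations, not part of the researchers' Artificial Vision Support System:

```python
import numpy as np

def simulate_prosthesis_view(frame, grid=(4, 4), brightness_gain=1.0):
    """Block-average a grayscale frame down to a coarse pixel grid,
    mimicking a low-resolution implant (e.g. 4x4 = 16 'pixels')."""
    h, w = frame.shape
    gh, gw = grid
    # Trim so the frame divides evenly, then average each block.
    coarse = frame[:h - h % gh, :w - w % gw].reshape(
        gh, h // gh, gw, w // gw).mean(axis=(1, 3))
    # A simple brightness-enhancement filter, clipped to the 8-bit range.
    return np.clip(coarse * brightness_gain, 0, 255)

# A synthetic 240x320 "camera" frame: dark left half, bright right half.
frame = np.hstack([np.full((240, 160), 30.0), np.full((240, 160), 200.0)])
view16 = simulate_prosthesis_view(frame, grid=(4, 4))   # 16-pixel view
view50 = simulate_prosthesis_view(frame, grid=(5, 10))  # 50-pixel view
```

Swapping the `grid` and filter parameters between trial runs is the software analogue of the pixel-array and filter comparisons described above.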
But "we're not quite at that stage yet," researcher Wolfgang Fink says of such independent maneuvering. Fink is a visiting associate in physics at Caltech in Pasadena, Calif., and founder of the school's Visual and Autonomous Exploration Systems Research Laboratory, where he and Caltech visiting scientist Mark Tarbell are collaborating on Cyclops with the support of a grant from the National Science Foundation.
The pair designed and built the body of the battery-operated rover using off-the-shelf parts, then furnished it with an onboard computing platform that allows for processing and manipulating images in real time using software they created called "Artificial Vision Support System."
Cyclops, so named because it's monocular, is about 12 inches wide by 12 inches long and 10 inches tall (the camera can be mounted on a mast to make Cyclops the height of an average person). It weighs about 15 pounds, Fink estimates, and can move at an "expedited walking speed" of about 2 to 3 feet per second.
For now, the platform itself is controlled remotely, via a joystick, and can be operated through a wireless Internet connection. "We have the image-processing algorithms running locally on the robot's platform," Fink says, "but we have to get it to the point where it has complete control of its own responses."
Once that's done, he adds, "we can run many, many tests without bothering the blind prosthesis carriers."
No fancy camera needed
The Cyclops camera is basic--an inexpensive consumer FireWire model. And that does the job just fine.
"Current retinal implants have anywhere from 16 to 50-plus pixels, whereas any cheap camera has a quarter million or more," explains Fink, who in addition to his work at Caltech is a professor of microelectronics at the University of Arizona. "Any camera will by far surpass the resolution of an implant." The only thing that's really important is that the camera produces images at a good clip--say, 30 frames per second.
Scientists worldwide--including Fink and Tarbell, who participated in the U.S. Department of Energy's Artificial Retina Project--are working on electronic eye implants and other systems that let people with retinitis pigmentosa and age-related macular degeneration recognize objects and navigate through their environments unassisted.
Retinal implants use miniature cameras to capture images, which are then processed and passed along to an electrode array in an implanted silicon chip.
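That camera-to-electrode chain can be illustrated with a toy mapping from processed pixel values to stimulation amplitudes. This is a sketch under simplifying assumptions--a linear brightness-to-current relationship and a made-up amplitude ceiling; real implants use device- and patient-specific calibration:

```python
def pixels_to_stimulation(pixels, max_current_ua=100.0):
    """Map 8-bit pixel brightness values (0-255) to per-electrode
    stimulation amplitudes in microamps. The linear mapping and the
    100-microamp ceiling are illustrative assumptions only."""
    return [round(p / 255.0 * max_current_ua, 1) for p in pixels]

# A 16-electrode array driven by a 4x4 processed image, flattened row-major.
coarse_pixels = [0, 64, 128, 255] * 4
amplitudes = pixels_to_stimulation(coarse_pixels)
```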
In a system under development at MIT, users would wear special glasses fitted with a small camera that relays image data to a titanium-encased chip mounted on the outside surface of the eyeball. The chip would then fire an electrode array under the retina to stimulate the optic nerve. The glasses would also wirelessly transmit power to coils surrounding the eyeball.
The DOE estimates that fewer than 40 people around the world have been implanted with artificial retinas. They include a 50-year-old New York woman with a progressive blinding disease who in June was implanted with an experimental device made by Sylmar, Calif.-based Second Sight. The surgery, which was conducted by a team from NewYork-Presbyterian Hospital/Columbia University Medical Center, has partially restored the woman's vision, according to the hospital.
But implants and other visual enhancements pose unique design challenges. Chief among them: how do you evaluate the enhancements if you can't see what the person wearing them sees?
Next best thing
The Cyclops system offers an alternative to repeatedly testing the few people implanted with artificial retinas, or to having subjects with healthy retinas gauge low-resolution images on a computer monitor or head-mounted display (an approach that produces a less realistic picture, according to Fink).
"A sighted person's objectivity is impaired," he says. "They may not be able to get to the level of what a blind person truly experiences...The next best thing to actually using a blind person is having a machine where you can dictate what the visual input is for navigation."
Fink and Tarbell--who detail their work in an upcoming issue of the journal Computer Methods and Programs in Biomedicine--have filed a provisional patent on the Cyclops technology on behalf of Caltech. The pair has not yet used Cyclops to get feedback from someone with a real implant, but hopes to do so in the near future.