MIT researchers are trying to get computers to correctly interpret hand signals used by crews aboard aircraft carriers so that robot planes can follow them.
As Northrop Grumman continues to develop its autonomous drone aimed at carrier use, Yale Song and colleagues at MIT are working on a machine-learning system that could allow autonomous planes to understand crew directions.
In research presented in the journal ACM Transactions on Interactive Intelligent Systems, the team used a database of abstract representations of 24 gestures commonly employed by carrier personnel. From that database, they trained an algorithm to classify gestures based on features such as body posture and hand position.
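To give a rough sense of that training step, here is a minimal sketch of classifying poses against a labeled gesture database. The feature vectors, gesture names, and the nearest-centroid approach are all illustrative assumptions, not the actual MIT model.

```python
import math

# Hypothetical sketch: each gesture sample is a small feature vector
# (e.g. arm angles, hand state), and we fit one centroid per gesture.
# This is NOT the MIT team's algorithm, just a toy stand-in.

def train_centroids(database):
    """database: {gesture_name: [feature_vector, ...]} -> one centroid per gesture."""
    centroids = {}
    for gesture, samples in database.items():
        dims = len(samples[0])
        centroids[gesture] = [
            sum(s[d] for s in samples) / len(samples) for d in range(dims)
        ]
    return centroids

def classify(centroids, pose):
    """Return the gesture whose centroid is closest to the observed pose."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda g: dist(centroids[g], pose))

# Toy database: two of the 24 deck gestures, as made-up 3-D pose features.
db = {
    "brakes_on":  [[1.0, 0.1, 0.0], [0.9, 0.2, 0.1]],
    "move_ahead": [[0.0, 1.0, 0.9], [0.1, 0.9, 1.0]],
}
model = train_centroids(db)
print(classify(model, [0.95, 0.15, 0.05]))  # pose near the "brakes_on" samples
```

A real system would use far richer pose representations and a probabilistic classifier, but the shape of the task is the same: map an observed pose to the most likely gesture in the database.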
As seen in the video below, the system works with a single stereo camera. It analyzes each frame in a sequence and calculates the probability that the movement it sees is part of a particular gesture, while tracking those probabilities across the whole sequence so that gestures can be recognized continuously.
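The idea of maintaining probabilities over a whole sequence can be sketched as a running score per gesture, updated frame by frame. This is a simplified, hypothetical illustration of that kind of probabilistic tracking, not the published algorithm; the per-frame likelihoods here are invented.

```python
import math

# Illustrative sketch: a frame classifier emits per-gesture likelihoods for
# each video frame, and we keep a running log-probability per gesture over
# the sequence. Assumed names and numbers; not the actual MIT implementation.

def update_scores(scores, frame_likelihoods):
    """Add each gesture's per-frame log-likelihood to its running score."""
    for gesture, p in frame_likelihoods.items():
        scores[gesture] = scores.get(gesture, 0.0) + math.log(p)
    return scores

def best_gesture(scores):
    return max(scores, key=scores.get)

# Made-up per-frame likelihoods from a hypothetical frame classifier.
frames = [
    {"brakes_on": 0.7, "move_ahead": 0.3},
    {"brakes_on": 0.6, "move_ahead": 0.4},
    {"brakes_on": 0.8, "move_ahead": 0.2},
]
scores = {}
for frame in frames:
    scores = update_scores(scores, frame)
    # After every frame the system can report its current best guess,
    # which is what allows continuous recognition mid-gesture.
print(best_gesture(scores))  # -> brakes_on
```

Because the scores are updated after every frame, the system never has to wait for a gesture to finish before forming a hypothesis about what it is seeing.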
When tested, the system was able to correctly identify gestures in the training database with 76 percent accuracy, according to MIT.
Song and colleagues are trying to improve the gesture-recognition system by getting it to consider hand and arm positions separately, which should reduce its computational load.
Another aim is to have the system provide feedback about whether it understands the gestures it's considering.
That way, if you did give it the one-finger salute, you would get the appropriate robotic response.