A new Google Glass technology could help find and identify people by the clothes they wear.
Partly funded by Google, the InSight system pairs Google Glass with a smartphone app through which individuals identify themselves; it then analyzes their clothes, eyeglasses, and other items to build a visual profile. A person's name can then be displayed on the Google Glass headset whenever the wearer bumps into that individual, according to an article published yesterday by New Scientist.
One of the goals is to help Google Glass wearers more easily find friends in airports, stadiums, and other crowded places. There's just one drawback, or benefit, depending on your perspective.
The "visual fingerprint" created by the system is based on what a person is currently wearing. Once the individual changes clothes, eyeglasses, or other accessories, InSight can no longer identify that person. That means the fingerprint may be good for just a day or an evening, but it also means the individual's long-term privacy is protected.
How does it all work?
The fingerprint is created by a smartphone app that takes a series of pictures of a person. From those photos, the app builds a file known as a spatiogram, which records the colors and patterns of the person's clothes along with where they appear on the body. Google Glass then uses that combination to identify the person.
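To make the idea concrete, here is a minimal sketch of a spatiogram-style descriptor. This is not InSight's actual code; it assumes a simplified model in which each quantized color bin stores a pixel count plus the mean position of its pixels, and two fingerprints are compared by color overlap down-weighted by spatial distance.

```python
import numpy as np

def spatiogram(image, bins=8):
    """Simplified spatiogram: for each quantized RGB bin, record the
    fraction of pixels in that bin plus their mean (y, x) position.
    `image` is an (H, W, 3) uint8 array."""
    h, w, _ = image.shape
    # Quantize each channel into `bins` levels, then fold into one index.
    q = (image.astype(np.int64) * bins) // 256
    idx = (q[..., 0] * bins * bins + q[..., 1] * bins + q[..., 2]).ravel()
    ys, xs = np.mgrid[0:h, 0:w]
    # Normalize coordinates to [0, 1] so image size doesn't matter.
    ys = ys.ravel() / max(h - 1, 1)
    xs = xs.ravel() / max(w - 1, 1)
    n_bins = bins ** 3
    counts = np.bincount(idx, minlength=n_bins).astype(float)
    mean_y = np.bincount(idx, weights=ys, minlength=n_bins)
    mean_x = np.bincount(idx, weights=xs, minlength=n_bins)
    nz = counts > 0
    mean_y[nz] /= counts[nz]
    mean_x[nz] /= counts[nz]
    return counts / counts.sum(), mean_y, mean_x

def similarity(a, b):
    """Bhattacharyya-style overlap of the color histograms, reduced
    when matching colors sit in different parts of the image."""
    ca, ya, xa = a
    cb, yb, xb = b
    dist = np.sqrt((ya - yb) ** 2 + (xa - xb) ** 2)
    spatial = np.exp(-dist)  # 1.0 when the bins' positions coincide
    return float(np.sum(np.sqrt(ca * cb) * spatial))
```

Comparing a fingerprint with itself yields a similarity of 1.0, while a person in different-colored clothing scores lower; a real system would also fold in texture patterns and tolerate changes in pose and lighting.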
Of course, people can already be identified using facial recognition systems. But InSight is designed for situations where people are far away or have their backs turned, so their faces can't be seen.
The system has fared well in early tests. In a trial with 15 volunteers, InSight identified people correctly 93 percent of the time, even when their backs were turned to the Google Glass wearer.
InSight was developed by Srihari Nelakuditi, associate professor of computer science and engineering at the University of South Carolina, along with three colleagues at Duke University. An abstract of the system describes its design and use in greater detail.