One of the coolest things shown off at the TechCrunch50 conference may never become something any of us can use. It was a mythical technology demo from a company called Tonchidot Corporation, which showed off its "Sekai Camera" application. It uses both the camera on your phone and GPS to offer up near real-time tags of whatever you're looking at.
The funny thing is the entire demo could have been a complete hoax. We never saw the service in action--just a video of it shot in the gadget-saturated Akihabara district of Tokyo. It identified things like restaurants, local shops, and even products, with links to user reviews, ratings, and of course buying options.
If the technology works as shown, objects on the touch screen get tagged in near real time. Users can then interact with those objects through the handset's interface. In this case the handset was the iPhone, so users could organize what they were seeing into ordered lists and candy-colored floating tags that moved as they moved.
According to its creators, the technology pulls less information from the camera than it does from your location. That information gets piped over to Tonchidot's servers, then filtered into tags. It also borrows a model from some of the location-based social networks seen on the iPhone, letting users leave little virtual "hobo codes" for one another around major cities. Say, for instance, you ate somewhere and didn't like it: you could visually tag the place and leave your review, and others would see it when they used the application.
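Tonchidot hasn't published how its servers actually work, but the model described above (a GPS fix goes in, nearby user-left tags come out) is simple enough to sketch. Here's a minimal, hypothetical version in Python: every tag store, coordinate, and label below is made up for illustration, and a real service would use a spatial index rather than a linear scan.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical server-side store of user-left "hobo codes",
# each pinned to the coordinate where it was dropped.
TAGS = [
    {"label": "Ramen shop: great, go here", "lat": 35.7023, "lon": 139.7745},
    {"label": "Electronics store: overpriced", "lat": 35.6995, "lon": 139.7711},
    {"label": "Shibuya cafe: slow service", "lat": 35.6595, "lon": 139.7005},
]

def tags_near(lat, lon, radius_m=500):
    """Filter the tag store down to tags within radius_m of a GPS fix."""
    return [t for t in TAGS
            if haversine_m(lat, lon, t["lat"], t["lon"]) <= radius_m]

# A handset reporting a fix in Akihabara gets only the two nearby tags;
# the Shibuya tag (several kilometers away) is filtered out.
nearby = tags_near(35.7022, 139.7740)
```

The phone would then overlay the returned tags on the camera view; the server never needs the camera frames at all, which matches the creators' claim that location does most of the work.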
Things we still don't know about the technology include:
-Who will be serving the advertisements attached to local shops and products
-Whether it's limited to the iPhone or will work on any device with a camera, GPS, and a fat data pipe
-What happens when things change in local areas, since the visual tags are based partially on things the technology recognizes
-When this would be available in the iPhone App Store