Co-founder Sergey Brin shows off Google's computerized glasses -- but they're only for Google I/O attendees -- i.e., developers -- who are on the "bleeding edge."
SAN FRANCISCO -- The first Project Glass products -- Google's network-enabled, computerized glasses -- are set to ship to a select group of enthusiasts early next year, co-founder Sergey Brin said today.
"This is not a consumer device," Brin told thousands in an enthusiastically cheering audience at the company's Google I/O show today here. "You have to want to be on the bleeding edge. That's what this is designed for."
The glasses will be available only to Google I/O attendees who are in the United States. The geographic restriction is for regulatory reasons, Brin said. (Different countries have different requirements for radio-frequency emissions.)
Google demonstrated the glasses with a dramatic live Google+ hangout involving four parachutists who jumped out of a blimp above San Francisco and landed on the roof of San Francisco's Moscone Center, where Google I/O was taking place. Each wore Project Glass glasses that broadcast what they saw. So did stunt bicyclists and climbers who rappelled down the side of the building. All joined a relieved Brin on stage to a standing, applauding crowd that clearly liked the show.
"You've seen demos that were slick and robust. This will be nothing like that," Brin said as he introduced the publicity stunt. "This could go wrong in about 500 different ways."
Setting up the stunt was tricky. The glasses have Wi-Fi and Bluetooth built in, and Google also tried mobile data networks, Brin said in an interview. But ordinary technology doesn't work well when people are dropping at 120 miles per hour.
"3G doesn't work. It cuts out over 1,000 feet," Brin said.
So Google tried four different approaches, ranging from a "home-brewed" transmission technology it hacked together to some expensive military options, Brin said. As it turned out, all four worked during the demo.
The glasses, now as light as regular sunglasses, come with a touch panel on the side, a button on top to take photos and videos, and a transparent screen to show information. They perch just above a person's regular vision so they don't interfere with ordinary eye contact. Google believes they're better for capturing a first-person view of the world, such as spontaneous photographs people would miss if they had to take time to dig out a camera.
The ambition is much bolder, though: in effect, an augmented brain.
"Someday we would like to make this so fast that you don't feel like, if you have a question, you have to go seek the [answer]. We'd like it to be so fast that you just know it. We'd like to be able to empower people to know information very, very quickly," one of the project engineers said.
Google showed off what it considers an ideal use for Project Glass with a video and photos: baby pictures. Because babies look at faces, the interaction is more natural, making it easier to catch that elusive early smile.
"The baby looks into the mom's eyes, they connect. While doing that, she can capture this moment without any distractions," said Isabelle Olsson, the Project Glass lead designer.