
Google Reveals Project Astra: An All-Seeing AI That Could Live In Your Glasses

The always-scanning AI assistant demo looked a step ahead of where Meta, Humane and Rabbit are right now.

Scott Stein, Editor at Large

Project Astra is an AI assistant that could live on your future glasses.

James Martin/CNET

AI and AR are going to dovetail. Meta has already talked about this, and Project Astra, a new AI initiative announced at Google I/O on Tuesday that can continuously scan camera feeds to provide contextual understanding of the world around you, certainly looks like a model for where next-gen glasses and AI devices could be heading.

Google called Astra "the future of AI assistants," a universal AI agent that can "help in everyday life." The goal is an agent that's quick and conversational, with little lag. Having used Meta's Ray-Ban glasses with onboard AI, I've noticed that the lag between asking what I'm looking at and getting an answer can stretch to several seconds. Google's Astra demo, performed with a phone continuously pointing its camera around a room, was nearly instant with its responses.


Recent AI gadgets -- the Rabbit R1, Humane AI Pin and Meta's Ray-Bans -- can ping the world with cameras that onboard AI then analyzes and responds to, but the responses are generally slow and the functions limited. In Google's Astra demo, things looked a lot snappier.

Google even seemed to show the new AI running on a pair of glasses, a tease that suggested more wearable AI tech to come.


Google demonstrated Astra on a phone, and also on camera-enabled glasses.

Google

The person interacting with Astra moved from a phone to camera-enabled glasses, looking around and asking Gemini questions with both hands free, like, "Where did I leave my glasses?" "On the corner of the desk, next to a red apple," Gemini answered. The glasses looked almost like the live-translation AR glasses Google announced but never released two years ago. Google Glass -- Google's first attempt at assistant-driven smart glasses -- was released 11 years ago.

Google is already developing a mixed reality platform with Samsung and Qualcomm, and -- who knows? -- maybe a pair of camera-enabled AI glasses, too. AI is clearly the missing link for future XR devices, something Meta's Mark Zuckerberg and Andrew Bosworth have said for years. Google looks ready to make moves in this space as well.