
Google Begins Publicly Testing Its AR Glasses

The camera-enabled assistive glasses won't be able to take photos or video, though.

Google glasses, disassembled and laid out on a table
Google's AR prototype glasses are entering public testing.
Google

A decade after Google Glass, Google is once again testing smart glasses in public. The company announced its smart glasses initiative earlier this year at its I/O developer conference, a project aimed at assistance rather than entertainment. Google is now starting to publicly test those glasses, the company announced today, beginning with dozens of pairs in field use and ramping up to several hundred by the end of the year.

Google's glasses are AR of a sort, relying on audio assistance that uses built-in cameras to recognize objects in an environment through AI, similar to how Google Lens can recognize objects and text with phone cameras. The glasses will not, however, be able to take photos or videos. Google is disabling those features on its field-tested glasses, focusing entirely on how the glasses can train their AI to recognize the world better.

The glasses, based on glimpses Google has shown in videos and photos, look nearly normal. But unlike Meta's publicly available and normal-looking Ray-Ban Stories glasses, which are designed mainly for taking photographs and listening to music, Google is focused on utility and assistive uses for its smart glasses right now. The specific early test cases Google lists at the moment are translation, transcription, visual search and navigation, the last of which will work with heads-up overlays similar to how Google Maps uses heads-up AR directions on phones.

Google's AR glasses prototype testers are prohibited from using the glasses "in schools, government buildings, health care locations, places of worship, social service locations, areas meant for children (e.g., schools and playgrounds), emergency response locations, rallies or protests and other similar places," or while driving or playing sports. Google hasn't revealed where in the US, specifically, these glasses will be tested.

According to Google, "an LED indicator will turn on if image data will be saved for analysis and debugging. If a bystander desires, they can ask the tester to delete the image data and it will be removed from all logs." The glasses don't take photos or videos, but they do use image data for their assistive AI. Google promises that "the image data is deleted, except if the image data will be used for analysis and debugging. In that case, the image data is first scrubbed for sensitive content, including faces and license plates. Then it is stored on a secure server, with limited access by a small number of Googlers for analysis and debugging. After 30 days, it is deleted."

Field-testing future smart glasses seems to be a growing trend. Meta started testing prototype depth-sensing camera arrays on a pair of glasses called Project Aria two years ago, focusing on how smart, sensor-filled glasses could be used responsibly in public places.

Google already ran its own large-scale smart glasses test nearly a decade ago when it launched Google Glass, a device that sparked many of the first conversations about public camera use and privacy with AR headsets and glasses. Google's new project looks to be on a far smaller and more focused scale, and the company hasn't announced plans to make the glasses a commercially available product yet.