

Audio isn't a key part of Google AR yet, but it should be

Google AR could be exploring immersive audio next.

[Photo: Apple AirPods. Caption: Google is thinking about smarter audio. Credit: CNET]

Google runs a lot of key services through voice and sound — the Assistant, in particular. But immersive audio, the kind the Bose Frames audio glasses have promised, hasn't yet been integrated into Google's AR plans. The company is thinking about it, though, as a conversation with CNET at Google's I/O developer conference confirmed.

"When we talk about immersion, we're sometimes too focused on the camera," Aparna Chennapragada, head of Google Lens and AR, says, bringing up some advantages audio already has. "I think the podcast is a very clear example... AirPods and podcasts." Chennapragada also specifically brings up the immersive audio app Detour, acquired by Bose in 2018.

"Ideas like that, a key part is location-specific audio: you can think about museum tours, etc.," Chennapragada says, but she suggests Google is still thinking it out: "Starting to figure out exactly how they intersect, there's a lot of user experience problems and challenges we have to work through as well, along with content. But I think the premise is sound there."

Qualcomm's reference design for a new range of Google Assistant-ready headphones doesn't incorporate any spatial or position-aware augmented audio yet, but smarter audio — particularly as a way to provide assistance and capture attention without distraction — seems like a key next step. Spatial audio for assistance in apps like Microsoft Soundscape, and the immersive audio experiments Bose has been conducting with Frames, point to a lot of amazing possibilities.


A key advantage of immersive spatial audio, of course, would be not having to stare at a screen. While we're probably already keeping earbuds in our ears too much, that's less of a distraction problem than being heads-down in a device.

Google's current focus on assistive AR is heavily vision-based, and all the experiences I tried required looking at a phone screen. That can become a problem for AR apps like Google Maps, which shouldn't distract users from their surroundings. AR Maps UX designer Rachel Inman also confirmed the team is interested in ways audio could bridge that gap in future AR experiences.

Maybe that interest could mean something in the works for next year's Google I/O.