
Apple's future AR pieces hid in the corners during a virtual WWDC

With rumors of Apple AR glasses on the horizon, the news from Apple's biggest developer event was surprisingly quiet.

Scott Stein, Editor at Large

The iPad Pro's lidar is a major part of Apple's next-wave AR plans.

Scott Stein/CNET

Apple's making an AR headset. Or glasses. Or an AR/VR headset. Something like that, something coming in 2021, 2022... or 2023. Reports have been everywhere; the pieces seem to be falling into place.

And yet, at this year's video-only WWDC conference in a coronavirus-scrambled 2020, those plans weren't announced -- or even hinted at. Instead, Apple laid down pieces that seem to fit a larger puzzle, and left plenty of questions unanswered.

Over the past four years, Apple CEO Tim Cook has telegraphed intentions to make augmented reality a major part of the company's future. With competition from Google, Facebook, Microsoft, Qualcomm and many others, Apple still has the pieces in play to make hardware that could land with a tremendous splash: millions of iPhones and iPads, an ecosystem of products running Apple-made processors and an evolving AR graphics toolkit. And yet, as everyone's been stuck at home living virtual lives, augmented reality experiences were downplayed at this year's event.

Last year, Apple unveiled integrated AR tools to allow multiplayer experiences and build pieces of virtual worlds. This year, Apple's updated ARKit 4 tools in iOS 14 and iPadOS 14 seem to have fewer dynamic new pieces, even if they're playing key parts. Some other announcements seem to play roles, too. (The new ARKit 4 features require Apple A12 processors or newer.)

Here's what could factor in, however, if you look at the puzzle pieces: a depth-sensing iPad, spatially aware AirPods, location-based markers tied to a more evolved reality-scanning Apple Maps, and more ways to link virtual worlds to real places.


Depth-mapping a room, with scene understanding, via the iPad Pro lidar.

Apple (screenshot by Scott Stein/CNET)

The iPad Pro's depth sensing is key

This year's iPad Pro, released earlier this spring, has a unique lidar sensor that scans spaces and can create 3D maps. It's very likely a preview of the world-scanning tech that will be on future iPhones and, eventually, in Apple's AR headsets, too. Apple's new ARKit 4 toolkit for developers has a Depth API that takes greater advantage of the sensor and promises more accurate measurements. Already, developers are using lidar to scan homes and spaces and mesh out scans that could be used not just for augmented reality, but for saving models of places in CAD-ready formats.
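For developers, tapping into that data looks fairly direct. Here's a rough Swift sketch of what reading depth through ARKit 4's Depth API could look like -- my own hedged example, not Apple's sample code, and it assumes a lidar-equipped device like the 2020 iPad Pro:

    import ARKit

    // Sketch: ask ARKit 4 for per-pixel scene depth from the lidar sensor.
    class DepthReader: NSObject, ARSessionDelegate {
        let session = ARSession()

        func start() {
            // Only lidar-equipped devices (like the 2020 iPad Pro) support scene depth.
            guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
            let config = ARWorldTrackingConfiguration()
            config.frameSemantics = .sceneDepth
            session.delegate = self
            session.run(config)
        }

        func session(_ session: ARSession, didUpdate frame: ARFrame) {
            // depthMap is a buffer of distances in meters; confidenceMap rates each reading.
            guard let depth = frame.sceneDepth else { return }
            let width = CVPixelBufferGetWidth(depth.depthMap)
            let height = CVPixelBufferGetHeight(depth.depthMap)
            print("Got a \(width) x \(height) depth frame")
        }
    }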


Apple Maps data lining up with a real-world spot where AR could appear.

Apple (screenshot by Scott Stein/CNET)

Real-world locations and AR, blending further

Much like Microsoft, Snapchat and Google, Apple is adding Location Anchors to its iOS 14 AR tools, but with some precision tools lining up GPS and Apple Maps data. These specific geolocation markers will make it easier to pin virtual things to specific places. Microsoft's Minecraft Earth, released last year, already has location-specific anchors. Apple looks ready to push this idea further, for shared worlds and maybe even experiences that could be pinned to city maps. Combined with the multiuser possibilities AR already enables, this should lead to more shared, pinned-down things in reality, like location-specific art experiences. One interesting thing, though: Apple says its new geolocation anchors will only work in certain major US cities for now, since the feature relies on more advanced Apple Maps data to coordinate and fine-tune positioning.
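To give a sense of how that works under the hood, here's a short Swift sketch of ARKit 4's geotracking setup as I understand it -- the coordinates are placeholders, and the feature only runs in Apple's supported cities:

    import ARKit
    import CoreLocation

    // Sketch: check whether geotracking works here, then pin an anchor to a map coordinate.
    func startGeoTracking(in session: ARSession) {
        ARGeoTrackingConfiguration.checkAvailability { available, _ in
            guard available else { return }   // only certain major US cities for now
            DispatchQueue.main.async {
                session.run(ARGeoTrackingConfiguration())

                // Placeholder latitude/longitude; AR content attached to this anchor
                // shows up at that real-world spot.
                let coordinate = CLLocationCoordinate2D(latitude: 37.7793, longitude: -122.4193)
                session.add(anchor: ARGeoAnchor(coordinate: coordinate))
            }
        }
    }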

Apple's new clip-like app snippets could be used to launch AR with a quick scan

A new iOS 14 feature called App Clips promises to bring up fast app snippets when you tap an NFC tag or scan a QR code. Imagine a tap-to-pay terminal that could offer more, like a menu of options, too. App Clips can also launch AR experiences, which means an NFC tap or QR code scan could kick off an AR experience without needing to download a full app. My mind leaps to stores that could use it to show items that aren't available in person, or museums with enhanced exhibits -- or who knows what else? This expands on Apple's already-present AR shopping tools, which rely on a previously downloaded app or loaded page. Now, those experiences could launch from a scan of something in the real world.
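As a thought experiment, here's roughly how a museum's App Clip might route a scanned tag into an AR view. The museum, the view and the URL are all hypothetical; the real machinery here is simply the clip receiving its invocation URL through NSUserActivity:

    import SwiftUI
    import Foundation

    // Hypothetical museum App Clip: a scanned NFC tag or QR code carries an exhibit ID
    // in its URL, and the clip opens an AR view for that exhibit.
    @main
    struct MuseumClip: App {
        @State private var exhibitID: String?

        var body: some Scene {
            WindowGroup {
                ExhibitView(exhibitID: exhibitID)
                    .onContinueUserActivity(NSUserActivityTypeBrowsingWeb) { activity in
                        // e.g. https://museum.example.com/clip?exhibit=42 encoded in the tag
                        guard let url = activity.webpageURL,
                              let items = URLComponents(url: url, resolvingAgainstBaseURL: false)?.queryItems
                        else { return }
                        exhibitID = items.first(where: { $0.name == "exhibit" })?.value
                    }
            }
        }
    }

    struct ExhibitView: View {
        let exhibitID: String?
        var body: some View {
            // A real clip would hand off to AR Quick Look or a RealityKit scene here.
            Text(exhibitID.map { "Loading AR exhibit \($0)..." } ?? "Scan a tag to begin")
        }
    }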


AirPods Pro can do spatial audio. That could lead to environmental audio AR.

CNET

Spatial audio on AirPods Pro looks like Apple's audio AR

Last year, I started thinking about audio as pretty key to augmented reality: Instead of lifting a phone or even glasses to see a virtual thing in your world, audio cues could be a casual way of bringing up environmental info with earbuds. It's a lot less intrusive, and could include the option of launching some kind of visual AR afterward. After all, we already wear headphones all the time and live in audio bubbles. Apple's AirPods have often seemed like testbeds for an immersive future.

iOS 14 enables spatial audio in Apple's step-up AirPods Pro models, using motion tracking to position audio depending on how your head moves. For now, it's meant for listening to surround sound on an iPhone or iPad, and Apple hasn't integrated spatial audio for AirPods Pro into ARKit yet, but it could be applied to audio AR experiences, too. Combine it with eventual glasses, and it makes perfect sense. Bose tried this with audio AR before shutting down the experiment this year, but Apple could pick up where Bose left off.
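There is, however, a new iOS 14 hook that points in this direction: CoreMotion's headphone motion manager, which reads head movement from AirPods Pro. Here's a quick Swift sketch of listening to that feed -- my example, not Apple's, and a real app would also need a motion-usage entry in its Info.plist:

    import CoreMotion

    // Sketch: read head orientation from AirPods Pro via iOS 14's CMHeadphoneMotionManager.
    // This is the kind of data an audio AR layer could use to steer sounds around you.
    let headTracker = CMHeadphoneMotionManager()

    func startHeadTracking() {
        guard headTracker.isDeviceMotionAvailable else { return }
        headTracker.startDeviceMotionUpdates(to: .main) { motion, _ in
            guard let attitude = motion?.attitude else { return }
            // Yaw, pitch and roll of the listener's head, in radians.
            print("yaw \(attitude.yaw), pitch \(attitude.pitch), roll \(attitude.roll)")
        }
    }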


Apple AR can cast virtual video screens now

An in-the-weeds feature of ARKit 4 called "video textures" does something I've seen in AR headsets like Magic Leap: It can project video into AR. This can be used for floating TV screens, or to map moving video avatars onto 3D models. Right now, it may seem silly to use your iPhone or iPad to create a floating virtual TV screen in your living room, when the iPhone or iPad literally is a mini TV screen. But, in a pair of glasses, this idea doesn't seem silly at all.
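Under the hood, this is likely exposed through RealityKit's new VideoMaterial, which textures a 3D surface with a playing video. Here's a hedged Swift sketch of that floating-screen idea -- the video URL is a placeholder, and the entity would still need to be attached to an anchor in an ARView:

    import RealityKit
    import AVFoundation

    // Sketch: build a small "floating TV" entity whose surface plays a video.
    func makeVideoScreen(playing url: URL) -> ModelEntity {
        let player = AVPlayer(url: url)
        let material = VideoMaterial(avPlayer: player)

        // A roughly 16:9 plane, about half a meter wide, textured with the live video.
        let screen = ModelEntity(mesh: .generatePlane(width: 0.5, height: 0.28),
                                 materials: [material])
        player.play()
        return screen
    }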

The projected video avatars idea is fascinating, too. Right now, AR and VR don't do great jobs of showing people's real faces in virtual worlds; usually it feels more like living in a cartoon or puppet universe. Even in virtual Zoom-like conference apps like Spatial, avatars look like crude stretched-out approximations of real acquaintances. Maybe video-mapped avatars could be a step toward meeting with holographic friends in future AR FaceTime calls.

What does it all mean? 

If you were expecting something big from Apple in AR (or VR), it's not happening now. No Apple headset. No Apple Glasses. Nor, alas, the ability to plug existing AR headsets into Apple devices. But Apple's AR tools are getting very advanced, even without a headset. It remains as unclear as ever when actual Apple AR headsets will arrive, but the existing iPad Pro looks like it'll continue to double as Apple's AR dev kit for now.