
Apple's new iOS 15 features would be a perfect fit for AR glasses

Apple doesn't have a headset yet, but the latest software for iPhones, iPads and Macs keeps laying the groundwork

Scott Stein, Editor at Large

How long will it be until this is a reality? Apple's WWDC intro winked at telepresence. It's not a joke, though.

Screenshot/Apple

Another year, another no-show for Apple AR glasses at the annual WWDC conference. At the company's second all-virtual version of the developer conference, there wasn't a peep about Apple's long-expected VR and AR headsets. No new big AR push, either. You could have walked away from the WWDC keynote thinking Apple wasn't emphasizing AR much at all. At least, so far: 2021 isn't over yet.

In fact, look more closely and there are already a lot of puzzle pieces scattered all over the place. Apple's ARKit and RealityKit developer tools added some deeper features to manage more objects and bigger virtual landscapes overlaid onto the real world. Core apps started to get some AR hooks, too: Apple Maps is adding AR directions, like Google Maps already does. And much like Google Lens, Apple is introducing ways to read and search for text in photos, or via the Camera app.

Spatial audio, sharing and FaceTime: Beginnings of telepresence?

Tim Cook stepped onto the virtual stage at WWDC to face an audience of Memoji, Apple's AR avatars, which have already been around for three years. Perhaps it was meant to represent the feeling of all of us watching from home. I kept looking at it and thinking about the future of telepresence.

Apple doesn't have its own social AR communication apps yet, but others do. Spatial, a company that has its own VR and AR apps on headsets and phones, is one example. Many of these lean on spatial audio to create a sense of presence and direct attention. Facebook considers spatial audio to be a cornerstone of how people will communicate with AR glasses.

And in iOS 15, Apple is adding spatial audio to FaceTime calls. If you haven't played with VR or AR social apps, spatial audio in FaceTime might seem like overkill. I haven't tried it on iOS 15 yet, but I have a feeling it matters a lot more than it seems to. In larger groups, it could help create a map of who's where. On a FaceTime grid, that might not matter much. But in an eventual room full of hovering FaceTime holograms, like Microsoft is already playing with in Mesh on HoloLens, it could be key. Spatial audio is also being knitted more deeply into Apple's ARKit capabilities, and it makes me wonder what's next.
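
For developers, some of that plumbing already exists: RealityKit can spatialize any sound from an entity's position in an AR scene. Here's a minimal sketch, assuming an entity already placed in a scene; the audio file name is hypothetical:

```swift
import RealityKit

// A minimal sketch of spatial audio in RealityKit: audio loaded in
// .spatial mode is emitted from the entity's position in the AR scene,
// so listeners hear direction and distance. "voice.mp3" is hypothetical.
func attachSpatialVoice(to entity: Entity) {
    do {
        let voice = try AudioFileResource.load(
            named: "voice.mp3",     // hypothetical file in the app bundle
            inputMode: .spatial,    // spatialize relative to the entity
            shouldLoop: false
        )
        // Moving the entity moves the apparent source of the sound.
        _ = entity.playAudio(voice)
    } catch {
        print("Couldn't load audio: \(error)")
    }
}
```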

The added sharing tools in FaceTime, while also feeling like a late catch-up to Zoom, seem very important. If Apple is developing an OS for glasses that will let people share worlds together, it's going to need to figure out how people can connect and instantly show apps, content and more to each other. Evolving its own FaceTime tools seems like the very first step.
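
Those sharing tools ride on a new iOS 15 framework called GroupActivities, which is how apps plug into FaceTime's SharePlay. A rough sketch of the shape of it; the activity name and metadata here are illustrative, not Apple's:

```swift
import GroupActivities

// A rough sketch of a SharePlay activity via the GroupActivities
// framework: the app defines an activity, and FaceTime offers to
// sync it across everyone on the call. Names are illustrative.
struct SharedGallery: GroupActivity {
    var metadata: GroupActivityMetadata {
        var meta = GroupActivityMetadata()
        meta.title = "Browse a gallery together"
        meta.type = .generic
        return meta
    }
}

// Offer the activity to the current FaceTime call, if any.
func startSharing() async {
    let activity = SharedGallery()
    switch await activity.prepareForActivation() {
    case .activationPreferred:
        _ = try? await activity.activate()  // hand off to FaceTime
    default:
        break  // no call in progress, or the user declined
    }
}
```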


Apple's Live Text scans with your iPhone camera, like Google Lens. But that's a tool that AR glasses could take advantage of.

Screenshot/Apple

Live Text and Maps: AR as a helpful tool

Stop me if you've heard this before: Augmented reality can be used to assist people. Google has made assistive AR a focus for several years, and both Google Maps and Google Lens use AR in different ways to show pop-up directions or to analyze text and objects in the real world to overlay information onto them.

That's been the dream for AR glasses since Google Glass eight years ago. Apple's introduction of these types of features in iOS 15 indicates that it's ready to treat AR as more than a magical experience or a way to shop for things. There are already plenty of useful AR-enabled apps on the App Store, but Apple's own OS hasn't integrated them much. Both Maps and Live Text seem like the beginnings of that integration.
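
Live Text itself is a system-level feature, but the underlying capability has been open to developers through the Vision framework since iOS 13. A minimal sketch of pulling text out of an image, which is roughly the kind of work a pair of glasses would need to do on every frame:

```swift
import CoreGraphics
import Vision

// A minimal sketch of on-device text recognition with Vision,
// the developer-facing cousin of Live Text: find text regions in
// an image and read back the best candidate string for each.
func recognizeText(in image: CGImage) {
    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        for observation in observations {
            if let best = observation.topCandidates(1).first {
                print("Found text: \(best.string)")
            }
        }
    }
    request.recognitionLevel = .accurate  // trade speed for accuracy

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```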

Object Capture: A preview of how Apple will evolve 3D scanning?

A pro tool announced at WWDC lets developers make high-resolution 3D files out of real-world objects. The iPhone and iPad can already do surprisingly capable 3D scanning through apps and hardware features like lidar, but the quality of those scans can be unreliable. Apple had never made its own 3D-capture software before; Object Capture is a start.

Unlike many existing 3D scanning tools, which map image-capture data onto 3D depth maps, Object Capture turns a bunch of photographs (captured via iPhone or iPad, or otherwise) into high-res 3D files. The processing happens on a Mac, which feels like a disconnect at first: Apple's iOS hardware -- the M1 iPad Pro especially -- seems to have plenty of processing power for tasks like these.
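
The API itself, a RealityKit PhotogrammetrySession, is compact. A rough sketch of the Mac-side flow, with hypothetical file paths:

```swift
import Foundation
import RealityKit

// A rough sketch of Object Capture on macOS Monterey: point a
// PhotogrammetrySession at a folder of photos and ask for a USDZ
// model back. The paths here are hypothetical.
let inputFolder = URL(fileURLWithPath: "/path/to/photos", isDirectory: true)
let outputFile = URL(fileURLWithPath: "/path/to/model.usdz")

let session = try PhotogrammetrySession(input: inputFolder)

// Listen for progress and completion on the async output stream.
Task {
    for try await output in session.outputs {
        switch output {
        case .requestProgress(_, let fraction):
            print("Progress: \(Int(fraction * 100))%")
        case .requestComplete(_, .modelFile(let url)):
            print("Model written to \(url.path)")
        case .processingComplete:
            print("Done.")
        default:
            break
        }
    }
}

// Detail ranges from .preview up to .full and .raw; the higher levels
// produce the high-resolution files meant for pro pipelines.
try session.process(requests: [.modelFile(url: outputFile, detail: .medium)])
```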

Apple is leaning on the Mac as a 3D processing tool for now, but that could also be a stepping-stone to how Apple approaches 3D object capture on future, more powerful iPhones and iPads.

The Object Capture tool is already being put to an extremely practical purpose: getting AR-enabled e-commerce on its feet in the next year. Virtual shopping experiences have proved effective through the pandemic, and Apple looks to be planning for Object Capture to bolster companies' libraries of 3D goods. Etsy is planning an expansion of its 3D shopping inventory in the fall, and Wayfair is making its own scanning app, using Apple's toolkit, for manufacturers selling through its store.

But at some point, 3D capture is going to be for everyday people, too: not just to share things, but to build objects and worlds that can live in AR. Apple may not be ready to lay all those pieces out yet on its hardware, but Object Capture brings Macs into the AR development fold.


Apple's App Clip Codes, announced last year, are part of Apple's ongoing effort to layer AR onto real things.

Screenshot by Jason Cipriani/CNET

Apple's real-world AR layer is slowly evolving

To have AR glasses that work in the real world, you need a real world that's mapped for AR. Apple has been remaking its world map gradually over the past few years, using lidar-equipped cars. A number of cities are becoming capable of real-world AR that can be tagged to physical locations. For Apple, those cities are all US-based for now; London will be the first outside the US, in the fall. Apple's latest ARKit tools need that location-based layer of AR data to make virtual art appear for multiperson experiences, or for things like the AR directions that pop up in the next version of Maps.
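
Developers reach this layer through ARKit's location anchors, which pin virtual content to geographic coordinates, but only where Apple's map data supports geotracking. A minimal sketch; the coordinate (Apple Park) is just illustrative:

```swift
import ARKit
import CoreLocation

// A minimal sketch of ARKit location anchors: check whether Apple's
// map data supports geotracking here, then pin an anchor to a
// real-world coordinate. The coordinate (Apple Park) is illustrative.
func placeGeoAnchor(in session: ARSession) {
    ARGeoTrackingConfiguration.checkAvailability { available, error in
        guard available else {
            print("Geotracking unavailable here: \(String(describing: error))")
            return
        }
        session.run(ARGeoTrackingConfiguration())

        let coordinate = CLLocationCoordinate2D(latitude: 37.3349, longitude: -122.0090)
        // Virtual content attached to this anchor shows up at that
        // physical spot for anyone whose session resolves it.
        session.add(anchor: ARGeoAnchor(coordinate: coordinate))
    }
}
```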

Apple is also pushing further into tagging real objects with QR code-like tags called App Clip Codes that, when scanned, bring up AR effects mapped to the object being scanned or to nearby things. The tags can launch Apple's mini-app App Clips, announced last year with iOS 14, in AR-capable formats. Apple started working on this idea last year, but progress on real-world tagged objects looks slow. Maybe we'll see products (Apple's own would make sense, or HomeKit accessories) start getting these App Clip Codes.
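
On the developer side, the detection half is already in place: ARKit can track App Clip Codes and hand back anchors to hang AR content on. A minimal sketch, assuming a standard ARSession and delegate setup:

```swift
import ARKit

// A minimal sketch of App Clip Code tracking in ARKit (iOS 14.3+).
func enableAppClipCodeTracking(on session: ARSession) {
    // Requires recent hardware; ARKit exposes the check directly.
    guard ARWorldTrackingConfiguration.supportsAppClipCodeTracking else { return }
    let config = ARWorldTrackingConfiguration()
    config.appClipCodeTrackingEnabled = true
    session.run(config)
}

// In your ARSessionDelegate: scanned codes arrive as anchors, each
// carrying a decoded URL that AR content can be attached to.
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    for case let code as ARAppClipCodeAnchor in anchors
    where code.urlDecodingState == .decoded {
        print("App Clip Code URL: \(String(describing: code.url))")
    }
}
```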

Lots of other companies are also pursuing real-world-based, multiperson AR: Snapchat, Niantic, Google, Microsoft and Facebook, for starters. How Apple's progress compares against those competitors could determine how quickly Apple releases an advanced pair of AR glasses that are designed to be worn all the time. Until then, Apple's expected upcoming VR/AR hybrid headset could bridge the gap for developers by being less reliant on real-world outdoor locations.

Is a pro headset coming next?

Apple could have its own AR/VR hardware next year. But odds are strong that the company will need to start discussing the new software and its significantly different OS much sooner, maybe a year ahead, based on how Apple has introduced previous new platforms. These new AR tools are building the sharing, capture and assistive dimensions that could lead right into Apple's headsets, which are expected to emphasize communication, collaboration and showing virtual things in the real world.

Apple's late arrival to the AR/VR headset scene wouldn't be anything new; the company tends to arrive late to new tech (the Apple Watch, for instance, or the iPhone or AirPods). While companies like Facebook, Snapchat and Microsoft are sharing their emerging ideas in more experimental states, Apple may be waiting to more fully bake its first headset effort. Or it could keep doing what it's already doing: evolving its AR software right out in the open, feature by feature.