Apple Lays Out More Pieces of Its Future AR Headset Puzzle

Scott Stein Editor at Large
A lunar lander 3D model appears to hover on an iPad screen.

Apple's existing augmented reality apps already do a lot. How much further can they go ahead of an expected headset?


What's happening

Apple didn't announce a virtual or augmented reality headset at its WWDC developer conference, but it did announce new software tools that push AR and social connection further.

Why it matters

Apple's expected headset will likely work with Macs, iPads and iPhones, building off of ideas already being developed on the company's software. Meanwhile, many other companies, including Google and Qualcomm, are advancing their AR ambitions.

What's next

A VR headset may not come until 2023, although it's possible one could be announced sooner.

Anyone following Apple's WWDC developer conference last week for news on its long-expected AR/VR headset was bound to be disappointed. The event had news about Macs, iPads, iPhones, Apple Watch, the smart home and even cars, but AR was barely mentioned.

After reports of delays, Apple may not end up releasing this headset until 2023. That could mean another year of waiting… or it could mean an event later this year previewing what's to come. Right now we're left with a lot of unknowns, but there were still clues, and new software that seems like it will be very useful for a headset to come.

Apple already has a well-developed set of AR tools for iPhones and iPads, and depth-sensing lidar scanners that can do the sort of real-world "meshing" that VR and AR headsets need to overlay virtual objects convincingly. There's also a set of tools that recognizes text and objects from camera feeds, much like Google Lens. The Apple Watch even has some accessibility-focused gesture recognition.

What WWDC 2022 showed off were a handful of refinements that, the more I think about them, seem aimed at laying down additional groundwork before a headset arrives.


SharePlay arriving in Messages makes it a connected social framework that could be useful in a VR headset.


FaceTime and Messages: Tools for Apple's metaverse?

As companies have pivoted their messaging to the metaverse over the last year, "metaverse" has generally been code for rethinking massively social, cross-platform interactions. "Social" is a weird thing for Apple, which isn't a social network-focused company like Meta, Snap or Niantic.

Apple does have FaceTime and Messages, which form a proprietary network across Apple's devices that could be the glue to connect with people on headsets. Apple's SharePlay framework, which was introduced in 2021's iOS 15, is trying to make collaboration and watching or playing content with others feel more instant. SharePlay's reach has expanded in iOS 16. A lot of it looks like the sort of adhesive that Apple's metaverse ambitions need.

Apple already has Memoji avatars, and it has been increasingly adding sharing tools that link apps and collaborative content through Messages and FaceTime. Those added features in iOS 16, iPadOS 16 and MacOS Ventura could make sharing things easier, but on a headset they could be essential shortcuts for connecting quickly with others. Meta's VR headsets lean on friends and party-based connections via Messenger for their social layer; Apple could be heading down the same path. SharePlay coming to Game Center, Apple's overlooked social gaming hub, seems like a similarly useful tool for future cross-play experiences.

A room gets phone-scanned, with furniture and walls being outlined in glowing lines.

RoomPlan scans a large room, plus its furniture, creating a 3D model.


Could RoomPlan be a stepping-stone to in-room mixed reality?

Apple announced one stealthy new tool in its upcoming ARKit 6: a room-scanning technology called RoomPlan. At first glance, the feature looks like Apple's own version of lidar-based room scanning, similar to what other companies have been developing. The tool recognizes furniture, walls, windows and other room elements, and quickly creates a 3D scan of a space for things like home improvement or construction.

Or, maybe, it could enable new forms of mixed reality. "Don't forget about using people's surroundings as a canvas for your experiences," says the WWDC developer video detailing RoomPlan, adding that you can "even bring people's spaces into the game you're building." While a lidar depth map could already overlay AR objects, what this new tech could enable is bringing a whole room into VR and overlaying virtual things onto that layout. I saw this idea a while ago in VR headsets that used cameras to scan environments and bring them into VR, creating a sort of mixed reality feeling, and it could well be the sort of mixed reality that Apple's expected camera-studded VR headset starts enabling.

A virtual pirate ship sits on a real pond, seen on a phone screen.

Background video quality for AR will get better in ARKit 6.


4K video for AR sounds like a path to headsets

Another new ARKit 6 feature seems notable: AR effects will now work with 4K video input. At first glance that's a weird feature for phones, whose screens are too small to appreciate 4K AR, though it could be useful for capturing video with AR effects overlaid. But increasing video quality for AR would be extremely helpful in VR headsets that use onboard cameras to combine VR with a video feed of the outside world, a technique called passthrough mixed reality.

VR headsets that use passthrough mixed reality, like the hardware Apple is expected to eventually release, rely on a high-quality video feed to make overlaid virtual objects look realistic. On the Oculus Quest 2, that video feed is grainy and in black and white. On the professional-grade Varjo XR-3, where I've had a chance to try a more advanced passthrough mixed reality, it's in color and much higher-res. 

The new ARKit features also look to be faster at recognizing room details for quick AR overlays and at motion-capture tracking. Both would be helpful, even necessary, in a headset with AR features.

A 3D map view of an airport.

Apple's adding more cities to its upgraded, AR-ready set of destinations in Maps.


Apple's still expanding cities where location-specific AR could work

A number of companies have recently expanded their mapping initiatives to work with AR so that future glasses could recognize "permanent" AR objects in real-world places. Google is expanding AR to work across many of its Street View-enabled maps, and Niantic's building a crowdsourced map of playable areas for AR games. Snap has been scanning cities with lidar. Apple has scanned cities with lidar as well, but only a handful. More are being added this year, but that means there are only certain places where location-specific AR will work reliably with Apple's AR toolkits. Apple is not expected to have any everyday-wearable AR glasses for a long while, and that makes sense: Besides concerns about battery life, cost or safety, AR glasses will need a worldwide mapping framework that is only half-built right now.

Still a lot we don't know

Despite many reports of a headset being imminent (or arriving in 2023), we still know nothing definitive about what Apple has planned. These little bits and pieces of speculation from Apple's new software tools are hardly proof of anything… but they show many of the necessary pieces falling into place in plain sight.