Apple doesn't have an AR headset yet, but its AR toolkit is paving the way

Reality Files, shared worlds, eye-tracking, virtual puppets: Apple's reality distortion field is accelerating.

Scott Stein Editor at Large
I started with CNET reviewing laptops in 2009. Now I explore wearable tech, VR/AR, tablets, gaming and future/emerging trends in our changing world. Other obsessions include magic, immersive theater, puzzles, board games, cooking, improv and the New York Jets. My background includes an MFA in theater which I apply to thinking about immersive experiences of the future.
Expertise: VR and AR, gaming, metaverse technologies, wearable tech, tablets. Credentials:
  • Nearly 20 years writing about tech, and over a decade reviewing wearable tech, VR, and AR products and apps
WWDC 2019 attendees play the Swift Strike AR game.
Stephen Shankland/CNET

I chased a giant ball that wasn't really there with a friend. I watched someone puppet a hovering sun with their face, and speak through its mouth. I sat through sessions where Apple employees discussed the importance of Reality Files, which could store sharable 3D experiences that would unfold on command.

Apple didn't spend much time discussing augmented reality on stage at WWDC 2019, but it turned up in a lot of places. And what Apple has built this time around is stranger and more fascinating than what was summarized in those brief moments. I tried to wrap my head around as much as I could during a few days in San Jose.

After seeing Google also discuss AR just a month earlier at Google I/O, I noticed some key differences. Google largely backed away from AR as a vehicle for wonder and magic, choosing a path hoping for both utility and cross-platform support: virtual animals popping into the world via Google Search, helpful reality-navigating shortcuts in Google Maps, or camera tools that highlight popular dishes on real restaurant menus.

Apple chose to emphasize building AR's graphic chops even further, to a point where, perhaps, reality and AR become indistinguishable, with a new set of AR creation tools that looks a lot like the start of a creative suite for everyday people. These might be essential pieces falling into place for an AR/VR headset expected as soon as next year, to compete with hardware like Microsoft HoloLens, Magic Leap and a growing range of lower-cost phone-compatible headsets.

But it also looks, at this point, like Apple's wild new realities are made for Apple devices and software only.


Reality is blending

To Apple, AR is still very much about increasing realism: It showcased a lot of graphics tricks using ARKit 3, which leans on the power of recent iPhone and iPad hardware, and RealityKit, a set of 3D graphics tools aimed at making whole AR scenes behave more realistically.

The effects I saw border on overwhelming, even if they're not always perfect. There's occlusion, which can recognize people and make sure virtual objects stay properly in front of or behind them in real time.
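For developers, people occlusion is an opt-in flag on the AR session configuration. A minimal Swift sketch (assuming an existing `ARSession` called `session`, which isn't spelled out in Apple's presentations) looks something like this:

```swift
import ARKit

let config = ARWorldTrackingConfiguration()
// People occlusion is ARKit 3-only and needs recent hardware,
// so check device support before opting in.
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    config.frameSemantics.insert(.personSegmentationWithDepth)
}
// session is an existing ARSession, e.g. an ARView's session
session.run(config)
```

With that flag set, ARKit segments people in the camera feed and uses estimated depth so virtual objects render behind them when they should.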

Motion blur and more advanced shadows, as shown in demos, made things look even more real in photos and videos. A depth-of-field effect can notice what's in focus on the camera and blur out virtual things to make them match the same focus. Virtual things can even have overlaid grain to match the noise in a camera's video.

Even last year, AR objects on iOS had begun looking real enough to casually fool someone online. These new effects feel like they're closing the gap even more. This is still happening on your iPhone or iPad, not on a pair of 3D glasses. But how many steps are left before that's the inevitable final result?

Shared worlds: What every AR platform is aiming for

Imagine hundreds of people running around in a universe full of magic things they can all see and share together, a metaverse that's layered on top of this reality. Pokemon Go began that dream, and Microsoft's Minecraft Earth will take it a step further by anchoring objects into the real world, and layering real people into the virtual world. Apple explored multiplayer AR last year, but that may have been more of a test run: The updates to ARKit 3 this year, according to Apple, unlock the multiplayer possibilities that seemed to be nearly there a year ago.

What I found most fascinating was that two people could start mapping their own spaces out, but when those spaces started to overlap, both people would suddenly find their maps shared and expanded. ARKit 3 isn't quite meshing an entire world to recognize furniture and chairs just yet like Microsoft HoloLens or Magic Leap can, because Apple's rear cameras don't have the same advanced depth-sensing hardware yet. But it can recognize floors, ceilings, walls, tables and now doors and windows. For a lot of situations right now that might be good enough.

The 2020 iPhone is rumored to have a time-of-flight camera that allows more advanced depth sensing, which could allow the phone to "mesh" the real world much like Microsoft HoloLens, Magic Leap or Google Tango phones did long ago. Add a headset with a 3D display system, and things could get into AR headset-land fast.


Wonderscope, a storytelling app (photo from 2018), is adding new AR puppetry tools using ARKit 3.

Wonderscope from Within

Virtual puppets, eye-tracking avatars

A small and fascinating new feature in ARKit 3 is that it can activate the front and rear cameras at the same time. According to Apple, this means the advanced depth-sensing TrueDepth camera can be used for things including eye tracking, while interacting with AR things projected into the real world.
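In code, this is a single switch on the world-tracking configuration: face tracking from the TrueDepth camera runs alongside rear-camera world tracking. A sketch, again assuming an existing `ARSession` called `session`:

```swift
import ARKit

let config = ARWorldTrackingConfiguration()
// ARKit 3 lets the TrueDepth camera track the user's face
// while the rear camera maps the world. Only supported on
// devices with both ARKit 3 and a TrueDepth camera.
if ARWorldTrackingConfiguration.supportsUserFaceTracking {
    config.userFaceTrackingEnabled = true
}
session.run(config)
// Face data then arrives as ARFaceAnchor updates in the session
// delegate, alongside the usual world anchors.
```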

Imagine you see a projected person in your room. You could look at them and talk, and they'd make eye contact and know your reactions, or emotions. It's the same stuff Microsoft has been promising in the HoloLens 2, or Magic Leap explored with its virtual person, Mica.

Within's AR storytelling app, Wonderscope, is already using this in an early build of its next app for the fall, but with facial puppetry. I saw a demo where a magic talking sun, floating in the room, could be controlled with your face on that iPhone, or while holding another iPhone across the room.

Samantha Storr, executive producer of original content for Within, says "we learned early on that kids have a more emotional experience when they engage with characters directly, make eye contact." In the app's future ARKit 3 updates, "kids can puppet characters, and reach their hand towards the screen to touch characters."

Apple's added motion capture to its AR tools, too. You could layer a virtual body over your own, or control something else using your body. People can be recognized in AR scenes, and folded into the 3D experience. It's unclear how well these tools could instantly apply to things like live performance, but there's a theme of puppetry and identity play that reminds me of what Instagram, Snapchat and others are already doing with face-mapping camera filters. (By the way, you don't really need a depth-mapping front-facing camera for some of these effects: Google's developed ways for similar face-tracking effects to work without a dedicated depth-tracking camera, too.)
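Motion capture gets its own session type in ARKit 3, which reports a tracked person as a skeleton of joints an app can drive a virtual character with. A rough sketch of the shape of it (the delegate wiring is abbreviated):

```swift
import ARKit

// Body tracking requires an A12-class chip; check support first.
func startBodyTracking(on session: ARSession) {
    guard ARBodyTrackingConfiguration.isSupported else { return }
    session.run(ARBodyTrackingConfiguration())
}

// In an ARSessionDelegate, tracked people arrive as ARBodyAnchor updates:
func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    for case let bodyAnchor as ARBodyAnchor in anchors {
        // e.g. the head joint's transform, relative to the body anchor,
        // which an app could map onto a virtual puppet's head
        _ = bodyAnchor.skeleton.modelTransform(for: .head)
    }
}
```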

Jonny Ahdout, director of development for Within, describes it as becoming theatrical, "almost like walking into a play." In future stories, he says, augmented reality puppeteering will become a bigger part of the experience. 

These front-and-rear-camera and face-tracking tools, combined with AR, make it start to feel like the iPhone/iPad is just becoming a narrow pane separating my reality from the other one. A headset with eye tracking... well, that's the next step. I can only hold an iPad or iPhone in the air for so long.

James Martin/CNET

Reality Files: Are we heading to an Apple-only reality?

While it looks like AR is starting to expand and become a territory that Microsoft, Magic Leap, Google, Apple and others start sharing across headsets and phones, there's also a concern of a format war. Apple, in particular, looks like it's focusing on iOS and the latest Apple-made ARM-based chips as the building blocks for its most advanced tools.

Last year, Apple's emphasis on 3D object support across iOS 12 involved the new USDZ file format, one that other 3D creators don't necessarily use. But Apple took another step into its own format territory with the Reality File, the file format Apple will use to save AR creations in its upcoming free Reality Composer app hitting iOS 13 in the fall.

Reality Files will only work in iOS, on ARKit-supported iPhones and iPads. I asked about support on ARCore-equipped Android phones, and Apple doesn't have any plans for that at this time. Reality Files will take advantage of Apple's own graphics tools and AR rendering techniques in ARKit. In that sense, if you're making virtual worlds in augmented reality with Reality Composer, you're making them for iOS and iPadOS.
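On the iOS side, consuming one of these files is meant to be a one-liner: RealityKit loads a Reality Composer scene straight into an AR view. A sketch, where "Experience" is a hypothetical .reality file name bundled with the app and `arView` is an existing RealityKit ARView:

```swift
import RealityKit

// Load the anchor from a bundled Reality Composer export
// (Experience.reality) and drop it into the AR scene.
if let sceneAnchor = try? Entity.loadAnchor(named: "Experience") {
    arView.scene.addAnchor(sceneAnchor)
}
```

That convenience is exactly the lock-in: the loading path runs through RealityKit, which only exists on Apple platforms.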

Maybe this is the beginning of a pathway that parallels where Qualcomm has been heading with AR and VR, using new chipsets and USB-C to create small glasses powered by phones. Maybe Apple does the same thing, in a sense, with a 5G iPhone in 2020.

Do all these things indicate that a pair of mixed-reality glasses could be coming next year, as reports have suggested? It's still unknown.

Apple's showing it's very serious about AR in 2019. But its most advanced and compelling effects require hardware no older than last year's A12-equipped iPhones and iPads, and it seems like Apple's most magical AR is, like many Apple ecosystems, destined for Apple devices only -- phone, tablet and maybe other forms, too.