
Apple's Vision Pro Upgrade Path Should Start at WWDC

There's a lot that the months-old Vision Pro is missing. Here's my wish list.

Scott Stein Editor at Large

It's time for Apple's Vision Pro to get some overdue OS and app upgrades.

Numi Prasarn/Viva Tung/CNET

Even though it's only been available for four months, Apple's Vision Pro headset feels well overdue for some updates. That feeling may be because, despite the headset debuting this winter, Apple first introduced the Vision Pro a year ago. WWDC 2023 was the Vision Pro's big debut: last June's conference included a whole wave of tech specs, dozens of developer videos and an in-person demo for some attendees, myself included. The promise of Apple's bleeding-edge mixed reality headset was massive. 

A year later, four months after its early February debut, there are still a lot of unfulfilled opportunities as we head into Apple's 2024 WWDC developer conference next week. It's time for Apple to unleash the Vision Pro in several ways, and maybe that will finally happen with the likely announcement of VisionOS 2.

The Vision Pro isn't for everyone: Its price is high ($3,500 and up) and its eye- and hand-tracking language can be alienating for anyone raised on touchscreens and keyboards, for starters. It represents a big bunch of new ideas for Apple, even if some of those ideas aren't necessarily new for people who have been working in AR and VR for years. If Apple wants the Vision Pro to truly evolve beyond what it's currently functioning as -- an occasional second screen for Macs, an immersive personal cinema and an experimental platform for occasional immersive collaboration -- that moment needs to be now.

International availability

The Vision Pro is still only available in the US, even though several developers have bought Vision Pros and taken them overseas. Reports from Bloomberg's Mark Gurman indicate the headset should finally go on sale in other countries, with an announcement likely coming at the WWDC keynote. That makes sense since WWDC is for developers, and many developers still can't acquire the Vision Pro easily.


The iPad Pro and Pencil should work with Vision Pro. When will that happen? (Or iPhones and Apple Watches?)

Numi Prasarn/CNET

Make it work with all the Apple devices

The Vision Pro is a standalone device that can be its own iOS-based spatial computer with a potent Apple M2 chipset. It works with AirPods and can be a second screen for a nearby Mac. The Vision Pro conspicuously lacks a direct connection with other Apple devices, specifically iPhones, iPads and Apple Watches.

Each of these three matters a lot more than Apple has been suggesting. Yes, the Vision Pro can play back 3D videos recorded on iPhone 15 Pros, but you can't use an iPhone to launch apps on Vision Pro or use the iPhone as a touchscreen or motion controller for Vision Pro interaction. Direct connections with phones and tablets would also be a huge benefit for managing Vision Pro for others. For example, Meta's Quest app on phones and tablets is how I launch apps and view streams of what my kids or family are trying on the Quest headset. I can re-center their experience, or load up something I want them to see.

I also want phone notifications to carry over to Vision Pro as they already do on Meta's Quest 2 and 3. I want to take calls I might have otherwise missed, or check work messages via apps I might not have directly loaded into Apple's headset.

Apple's iPad line feels like a particularly key Vision Pro companion, yet it doesn't speak with Vision Pro at all right now. I could use the iPad screen like an interactive touchpad, or use an iPad's connected keyboard and trackpad with the Vision Pro... or what if I could draw with a Pencil on the iPad and see the art on Vision Pro, where I could tweak it further in 3D? Apple's already touting 3D creative workflows on the iPad Pro, and yet its key 3D creative hardware isn't part of that picture yet.


Collaboration with Apple's Spatial Personas lets you invite friends into your worlds now, with a surprising immersive quality. Now do it for more apps and other devices, too.

Screenshot by Scott Stein/CNET

Open up creative collaboration on headset and with other Apple devices

Microsoft once pitched the HoloLens 2 headset as a device for viewing hologram-like mixed reality that could collaborate with phones and tablets, letting everyone look at the same mixed reality experiences together in real time. Multiperson collaborative AR is something Apple and Google have worked on for years, mainly on phones.

Apple's Head of AR and Vision Pro, Mike Rockwell, told me several years ago that the big advantage of augmented reality on iPhones versus specialized headsets was the sheer number of people who could access these experiences. Millions of people already have AR and even lidar-enabled depth-sensing devices in their pockets, even if they're not on their heads.

Apple has opened up virtual collaboration on Vision Pro using its semi-realistic "Persona" avatars, freeing the Personas to float in your space while seeing or interacting with apps together. What about letting people on phones and iPads blend with Vision Pro-wearing Personas, too?

The Vision Pro represents Apple's biggest push into AR technologies, but why not have experiences dovetail between headset and iPhone and iPad, especially for collaboration on ideas or creative projects in 3D? Companies like Qualcomm and hardware startups like Campfire have already been showing ideas along these lines. I saw a demo years ago from a company called Spatial that still makes me think about what's possible. Apple's clearly able to make this happen on 3D creative apps, if it wanted to. Now's the time to make it happen and enable the Vision Pro to be more than just a tool for other Vision Pro owners.


Google's already putting AR into its Maps app on phones. But where is Apple's mixed reality version of Maps for Vision Pro?

Google

Unleash all the other Apple apps into mixed reality

Apple needs to lead by example to encourage more developers to think boldly about the Vision Pro, and that needs to start with Apple's own core apps. Even months after the headset's debut, several Apple apps still haven't made the move to full Vision Pro modes. 

There are a few big omissions. Maps is begging for a Vision Pro mode, both to show off immersive 3D world views already in the Maps app and to allow for some location-based mixed reality possibilities between phones and headsets. Google already uses its Maps app as a discovery tool for location-based AR.

What about GarageBand? I've already played with some great musical creation apps in VR and AR, but GarageBand hasn't made the jump yet. Riffing on the piano, playing drums, editing tracks… it's an obvious fit.

iMovie should make the jump, too. All the Minority Report visions of editing timelines with pinches and swipes could literally become real in the headset. It would also be a way to explore editing movies on a far larger immersive canvas.


There are a lot of cameras on the Vision Pro: enable them for more apps.

James Martin/CNET

Open up permissions to developers

Apple's Vision Pro headset is a playground of cameras and sensors inside and out, but they're not all accessible to third-party developers. In particular, current Vision Pro apps can't use the cameras to truly see the world around you. That means mixed reality creatures or characters, or AI tools, won't really know much more than the generic wire-frame mesh of your 3D room shape that gets captured with the headset's lidar sensor. 

While camera permissions also mean potential privacy concerns (will my home environment be seen by others, or where will that data be stored?), not granting these permissions means holding back Vision Pro's true wearable AI capabilities. Figuring out how these permissions are given and what else can be done to blend the real and virtual and infuse mixed reality with AI should start now.

GenAI makes a lot of sense on the headset

AR and AI should work well together. Meta's visions for AR glasses lean heavily on AI, and Meta's CTO Andrew Bosworth sees generative AI appearing on Quest headsets soon. If Apple is making big AI announcements at WWDC, those plans must include Vision Pro in a major way. In the last few months, I've already found I use Siri more on Vision Pro than I do with my phone or watch. I open apps, close apps and search for things with my voice. Could I create with AI in Vision Pro, too?

We'll know more soon

WWDC is near, and Apple is expected to announce VisionOS 2, the next version of its headset software. We'll be there in Cupertino, and we'll report back on how many of our expectations for the next year of Vision Pro are met.