I Edited Lightroom Photos on Apple Vision Pro Using My Eyes. It Works

Exclusive: Glances and finger taps let you easily use Adobe's image editing software. The iPad version was readily adaptable to Apple's "spatial computing" headset.

Stephen Shankland

Lightroom runs on Apple Vision Pro, letting you edit and catalog photos by moving your eyes and tapping your fingers together.

Adobe

A few weeks ago, I was flying cross-country with a big pile of digital photos I wanted to edit and sort with my preferred tool for the job, Adobe Lightroom. But the seat in front of me was so close I could barely open my laptop lid. Disappointed, I ended up solving some crossword puzzles and watching a lame movie.

But what if I'd had a virtual or augmented reality headset — the Apple Vision Pro, for example — that projected my photos onto a big screen only I could see?

This week, I got to try out that exact technology. I used Adobe's new Lightroom app for the Apple Vision Pro. I operated the whole thing just by looking at what I wanted and tapping my fingertips together.

And it works. In this exclusive first look at the app, I can say it took only a few minutes to figure out how to use the headset for standard Lightroom actions like tweaking a photo's exposure, applying some editing presets or gradually dehazing the sky.

Color me impressed. My experience helped convince me not only that Apple has done a good job figuring out an effective interface for what it calls "spatial computing," but also that developers should have a reasonably easy time bringing at least their iPad apps to the system.

And that bodes well for the long-term prospects of Apple's headset. The more productivity and fun the Apple Vision Pro and its successors offer, the better the chance they'll appeal to a sizable population, not just some narrow niche like Beat Saber fans.

For me, the most compelling possibility with the Apple Vision Pro is using the virtual and augmented reality headset to have a private workspace in a public area. Lightroom fits right into that idea. I'm not ashamed or embarrassed by my photos, but I don't exactly enjoy sharing them with everybody on a plane flight.

Lightroom support isn't enough to get me to buy an Apple Vision Pro — starting price: $3,499 — but if I bought one eventually, I'd definitely use Lightroom on it.

For a broader view, check my CNET colleague Scott Stein's review of the Apple Vision Pro. He calls it the "best wearable display I've ever put on," with impressive technology but also a somewhat unfinished feel.

What Lightroom looks like on the Apple Vision Pro

When you launch Lightroom on the Vision Pro, a virtual window pops up with your photo catalog. It can occupy a huge part of your field of view, which is great. And you can have other windows off to one side or another if you want to multitask more easily than you can on an iPad.
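That windowing behavior comes mostly from the system rather than from anything app-specific. As a rough sketch of the idea (my own illustration, not Adobe's code, with hypothetical names like PhotoCatalogApp and CatalogView), a SwiftUI app declares its windows with the same WindowGroup scene an iPad app uses, and on visionOS that scene becomes a floating virtual window the wearer can reposition and rescale:

```swift
import SwiftUI

// Hypothetical app shell, not Adobe's code. The same WindowGroup an
// iPad app declares becomes a floating virtual window on visionOS.
@main
struct PhotoCatalogApp: App {
    var body: some Scene {
        WindowGroup {
            CatalogView()
        }
        // Suggest a generous starting size, in points; the wearer can
        // still grab the window and scale it up further in the room.
        .defaultSize(width: 1600, height: 1000)
    }
}

// Placeholder standing in for the real photo-grid view.
struct CatalogView: View {
    var body: some View {
        Text("Photo catalog")
    }
}
```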


I found the screen to be very good on the Apple Vision Pro. I wasn't bothered by low resolution and attendant pixelation of photos. Colors were bright, and tonal variations of photos looked good, though I wasn't performing any sort of calibration tests, so take that with a grain of salt. I was able to use Lightroom's glorious new HDR mode, though I had to dial the headset's background image down to very dark to get sufficient brightness headroom.

Apple's foveated rendering technology, a power-saving measure that shows high-resolution imagery only for the part of your field of view where you're directly looking, worked well for me. I never noticed low-resolution rendering.

I used the headset for a bit less than an hour, and it hadn't been custom fitted for my head, so I can't say much about weight and wearability. But the heft never crossed my mind; it was only when I took the headset off that I realized I hadn't noticed it.

How Lightroom works on the Apple Vision Pro

If you've used Lightroom on an iPad, you know what Lightroom on an Apple Vision Pro looks like. It's basically the same.

On an iPad, you can tap on a slider and move your finger back and forth to brighten or darken a photo's exposure, for example. That works the same way on an Apple Vision Pro, except instead of placing your finger on a screen, you direct your gaze at the control, and instead of tapping on the screen, you hold your thumb and index finger together.

Want to open a photo for editing? Look at it and tap your fingers together. Open up the effects editing panel? Look at the effects button and tap your fingers together. Apply a preset? Look at the preset button and tap your fingers together. Do you see a trend here?

I'd never used an Apple Vision Pro before, but it took me only a few moments to grasp this look-and-tap interaction. Double-tapping your fingertips, for example to zoom in on a Lightroom photo, is the immediately obvious analog of double-clicking a mouse or double-tapping a trackpad.
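One reason the port was presumably straightforward: on visionOS, a pinch made while your gaze rests on a control is delivered to the app as an ordinary tap, so existing iPad gesture code largely carries over. Here's a minimal SwiftUI sketch of the idea; PhotoCell and openForEditing are hypothetical names of mine, not Adobe's:

```swift
import SwiftUI

// Hypothetical photo-grid cell, not Adobe's code. On visionOS, resting
// your gaze on the cell and pinching thumb and index finger together
// arrives as the same tap a finger tap produces on iPad.
struct PhotoCell: View {
    let photo: Image
    let openForEditing: () -> Void

    var body: some View {
        photo
            .resizable()
            .scaledToFit()
            .hoverEffect()   // system highlight while your gaze rests on the cell
            .onTapGesture {  // one pinch = one tap; gaze picks the target
                openForEditing()
            }
    }
}
```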


CNET writer Stephen Shankland tests Adobe Lightroom software running on Apple's Vision Pro headset.

Also intuitively easy for me was dragging: Look at one spot, tap your fingers together, move your hand sideways, up or down, then move your fingers apart again to let go. Tapping together the fingers of both hands and then moving my hands apart also worked well for zooming in on a photo.
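Those motions, too, map onto standard gestures an iPad app already recognizes: the gaze-targeted pinch-and-move arrives as a drag, and the two-handed pinch-and-spread arrives as a magnification change. Another minimal sketch under the same caveat — hypothetical names, simplified state handling, not Adobe's code:

```swift
import SwiftUI

// Hypothetical zoomable photo view, not Adobe's implementation.
struct ZoomablePhoto: View {
    let photo: Image
    @State private var offset = CGSize.zero
    @State private var scale: CGFloat = 1.0

    var body: some View {
        photo
            .resizable()
            .scaledToFit()
            .scaleEffect(scale)
            .offset(offset)
            .gesture(
                // Pinch, move your hand, then spread your fingers to release.
                DragGesture().onChanged { value in
                    offset = value.translation
                }
            )
            .gesture(
                // Pinch with both hands and pull them apart to zoom.
                MagnifyGesture().onChanged { value in
                    scale = value.magnification
                }
            )
    }
}
```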

Some shortcomings

I was quickly able to be productive with Lightroom on the Apple Vision Pro, but it wasn't perfect.

I had some problems with eye-tracking accuracy. Sometimes the headset couldn't figure out which control I was looking at, but I hope that will improve with better calibration and with future Apple hardware and software updates.

I also sometimes found myself locating the control I wanted to use, then looking away before I tapped my fingers. I think that was because on computers, I'm used to aiming my mouse and then looking elsewhere as I click. So I had to learn to move at a more methodical pace.

A bigger problem from my perspective is that I use Lightroom Classic, the version of the editing and cataloging software that stores photos on my local hard drive and has a number of advanced features I enjoy. The Apple Vision Pro app is in the non-Classic Lightroom family, a more stripped-down version that stores photos in the cloud.

Somebody like me could still use Lightroom Classic on a Mac and then use the Vision Pro as a big virtual monitor, though the interface might not be as slick.

And for editing photos on that airplane flight, internet access would be a problem for cloud-based Lightroom. Fortunately, you can get Lightroom to download a group of photos ahead of time, so as long as you planned ahead, you'd probably be OK.

Last, some features don't work, like merging different shots into a single HDR photo. And if you want to take advantage of the Vision Pro's ability to view a panoramic photo in all its wrap-around glory, you'll have to export it to Apple Photos. That's easy to do, but I'd like a more immersive option in the Lightroom app itself.

Also on Apple Vision Pro: Adobe Firefly AI, Fresco, Behance

Adobe also released three other apps for Apple's Vision Pro. Its Firefly app lets you create imagery with Adobe's generative AI tool (though that won't work without an internet connection). Fresco is a version of its sketching app. And the Behance app lets you use the online portfolio tool with a slight social networking flavor.

Adobe has a lot of other apps, of course, like Photoshop, Illustrator and Express. It started with these four because they work readily with Apple's headset, said Eric Snowden, Adobe's vice president of design.

In contrast, software like Photoshop or Illustrator requires precise control over both the interface and the creative work, and often lots of numeric input into dialog boxes.

"It's something we would want to rethink," Snowden said. "It's not that it couldn't work, but I think there's a less direct translation."

For me, Lightroom's sliders and buttons were a natural fit for the Vision Pro. Maybe someday I'll be wearing one to edit my photos on a cramped plane flight.