iPhone 11 Deep Fusion camera is now available to try in iOS 13.2 public beta
Deep Fusion is an entirely new technique that the iPhone will use to improve detail and reduce image noise in the photos you take.
Patrick Holland, Managing Editor
Deep Fusion, Apple's new image processing technique for the iPhone 11, 11 Pro and 11 Pro Max, is now available as part of both the developer beta and public beta of iOS 13.2. Deep Fusion only works on iOS devices with an A13 Bionic processor, which currently means only the newest iPhones.
When the iPhone 11 and 11 Pro were first announced in September, Apple showed off the new ultrawide-angle camera, Night Mode and an improved selfie camera, all of which represented a significant step forward for iPhone photography and video. Now that the phones are in the wild, we've tested the new cameras and can confirm their improvements, as well as the absolute enjoyment of using that ultrawide-angle camera. But there's one camera feature Apple teased at its fall iPhone event that, until now, no one outside Apple had gotten to try: Deep Fusion.
While it sounds like the name of an acid jazz band, Apple claims the brand-new photo processing technique will make your pictures pop with detail while keeping the amount of image noise relatively low. The best way to think of Deep Fusion is that you're not meant to. Apple wants you to rely on this new technology but not think too much about it. There's no button to turn it on or off, or really any indication that you're even using the mode.
Right now, anytime you take a photo on an iPhone 11, 11 Pro or 11 Pro Max, the default mode is Smart HDR, which takes a series of images before and after your shot and blends them together to improve the dynamic range and detail. If the environment is too dark, the camera switches automatically into Night Mode to improve brightness and reduce image noise. With Deep Fusion, anytime you take a photo in medium to low light conditions, like indoors, the camera will switch automatically into the mode to lower image noise and optimize detail. Deep Fusion, unlike Smart HDR, works at the pixel level. If you're using the "telephoto" lens on the iPhone 11 Pro or 11 Pro Max, the camera will drop into Deep Fusion pretty much anytime you're not in the brightest light.
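Apple hasn't published the exact rules the camera uses, but the behavior described above can be sketched as a simple decision function. Everything here is hypothetical, including the brightness thresholds and the function name; it only mirrors the pattern the article describes: bright light keeps Smart HDR, medium-to-low light triggers Deep Fusion, very dark scenes trigger Night Mode, and the telephoto lens favors Deep Fusion in all but the brightest light.

```python
# Illustrative sketch only -- Apple has not documented Deep Fusion's switching
# logic. Thresholds and names are invented to match the described behavior.

def pick_processing_mode(light_level: float, lens: str) -> str:
    """Pick a processing mode for a scene brightness between 0.0 and 1.0."""
    if lens == "telephoto" and light_level < 0.9:
        return "Deep Fusion"   # telephoto uses Deep Fusion in all but the brightest light
    if light_level < 0.1:
        return "Night Mode"    # very dark: brighten the scene and reduce noise
    if light_level < 0.6:
        return "Deep Fusion"   # medium to low light, like indoors
    return "Smart HDR"         # bright light keeps the default pipeline


print(pick_processing_mode(0.95, "wide"))       # bright outdoor shot
print(pick_processing_mode(0.3, "wide"))        # indoor shot
print(pick_processing_mode(0.5, "telephoto"))   # telephoto, moderate light
```

The point of the sketch is that there's no user-facing toggle: the mode is a function of the scene and the lens, chosen before you ever see the photo.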
This means the iPhone 11, 11 Pro and 11 Pro Max have optimized modes for bright light, low light and now medium light. And I'd argue that most people's photos are taken in medium- to low-light situations like indoors. The impact that Deep Fusion will have on your photos is enormous. It's like Apple changed the recipe of Coke.
At the iPhone event, Apple's Phil Schiller described Deep Fusion as "computational photography mad science." And when you hear how it works, you'll likely agree.
Essentially, anytime you go to take a photo, the camera is capturing multiple images -- again, Smart HDR does something similar. The iPhone takes a reference photo that's meant to stop motion blur as much as possible. Next, it combines three standard exposures and one long exposure into a single "synthetic long" photo. Deep Fusion then breaks the reference image and the synthetic long photo down into multiple regions, identifying skies, walls, textures and fine details (like hair). Next, the software does a pixel-by-pixel analysis of the two photos -- that's 24 million pixels in total. The results of that analysis determine which pixels to use, and how to optimize them, in building the final image.
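To make the steps above concrete, here's a toy, purely illustrative version of the pipeline. Apple's real Deep Fusion runs as a machine-learned process on the A13 Bionic; in this sketch an "image" is just a 2D list of brightness values, and a precomputed detail map stands in for Apple's region analysis: detailed regions (hair, textures) keep the sharp reference pixel, smooth regions (skies, walls) take the low-noise blended pixel.

```python
# Hypothetical toy illustration of the described steps -- not Apple's algorithm.

def synthetic_long(standard_exposures, long_exposure):
    """Blend three standard exposures and one long exposure by averaging,
    producing a low-noise 'synthetic long' frame."""
    frames = standard_exposures + [long_exposure]
    rows, cols = len(long_exposure), len(long_exposure[0])
    return [[sum(f[r][c] for f in frames) / len(frames) for c in range(cols)]
            for r in range(rows)]

def fuse(reference, synth_long, detail_map):
    """Pixel-by-pixel selection: where detail_map is True (fine detail),
    keep the sharp reference pixel; elsewhere, use the low-noise blend."""
    return [[reference[r][c] if detail_map[r][c] else synth_long[r][c]
             for c in range(len(reference[0]))]
            for r in range(len(reference))]


reference = [[10, 20], [30, 40]]                      # sharp reference frame
standards = [[[8, 18], [28, 38]],                     # three standard exposures
             [[12, 22], [32, 42]],
             [[10, 20], [30, 40]]]
long_exp = [[14, 24], [34, 44]]                       # one long exposure
detail = [[True, False], [False, True]]               # True = fine-detail region

blended = synthetic_long(standards, long_exp)
final = fuse(reference, blended, detail)
print(final)
```

A real pipeline would of course derive the detail map from the images themselves and work on millions of pixels per frame, but the shape of the process -- blend for noise, then choose per pixel -- is what Apple's description boils down to.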
Apple says the entire process takes about a second. But to allow you to keep snapping shots, all of the information is captured up front and processed when your iPhone's A13 processor has a chance. The idea is that you won't be waiting on Deep Fusion before taking the next photo.
I should note that Deep Fusion is only available on the iPhone 11, 11 Pro and 11 Pro Max because it requires the A13 Bionic processor. Now that the beta is out, I'm excited to try it and share the results.