
Apple patent points to potential iPhone 8 camera capabilities

The focus-stacking patent granted this week hints at a bunch of interesting new features.

Lori Grunin Senior Editor / Advice

For Panasonic's post-focus feature, the iteratively focused photos are saved as a video. This is an example of what the capture looks like in action.

Lori Grunin/CNET

While several media outlets focused on the TV-related patents from the gaggle Apple was granted this week, there was one that I think points to a camera update (or at least new features) for the iPhone 8. Originally published in December 2016, it covers "optimizing capture of focus stacks."

Doesn't sound sexy? Focus stacking would enable some really neat capabilities, including a significant improvement to the current faux bokeh mode on the iPhone 7 Plus, depth-map creation for augmented-reality applications, the ability to choose the focus area in photos after-the-fact, and the ability to create photos without bokeh -- sharp focus throughout the image, aka infinite depth of field.

In-camera focus stacking works similarly to HDR in that it's a form of computational photography, where a camera quickly shoots multiple photos and combines them to produce a desired effect. For focus stacking, the camera takes photos while iterating through focus areas at successive distances. Then it combines the stack of images using the sharpest pixel from each to create a completely sharp image.
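The patent doesn't spell out Apple's merging math, but the "sharpest pixel from each" idea can be sketched with a common stand-in: score each frame's local sharpness with a Laplacian (edges score high, defocus scores low) and keep, at every pixel, the value from the highest-scoring frame. This is a minimal NumPy illustration, not Apple's implementation; the function names and the Laplacian sharpness measure are my assumptions.

```python
import numpy as np

def sharpness_map(img):
    # Local sharpness proxy: magnitude of a discrete Laplacian.
    # In-focus detail produces large second derivatives; blur flattens them.
    lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
           np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4 * img)
    return np.abs(lap)

def focus_stack(images):
    # Merge a stack of differently focused grayscale frames by keeping,
    # at each pixel, the value from the frame that is sharpest there.
    stack = np.stack(images)                           # (n, h, w)
    sharp = np.stack([sharpness_map(im) for im in stack])
    best = np.argmax(sharp, axis=0)                    # sharpest frame index per pixel
    return np.take_along_axis(stack, best[None], axis=0)[0]
```

A real pipeline would align the frames first and smooth the per-pixel decision to avoid seams, but the core selection step looks like this.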

As long as the camera has all that image data, there's tons it can do with it. For instance, it can let you select the focus area you like and merge the frames so that only that focal plane stays sharp -- that is, the ability to choose focus after the shot. It can use each shot's focus distance to build a depth map of the scene, so that an AR app can incorporate distance to place objects in its overlay.
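The depth-map idea follows directly from the stack: if you know the focus distance each frame was captured at, the frame in which a pixel is sharpest tells you roughly how far away that pixel's subject is. Here's a hedged sketch of that inference, reusing a Laplacian sharpness measure; the function name and distances are illustrative, not from the patent.

```python
import numpy as np

def depth_map(images, focus_distances):
    # Assign each pixel the focus distance of the frame in which it is
    # sharpest -- a coarse depth map an AR app could use for placement.
    sharp = np.stack([np.abs(
        np.roll(im, 1, 0) + np.roll(im, -1, 0) +
        np.roll(im, 1, 1) + np.roll(im, -1, 1) - 4 * im) for im in images])
    best = np.argmax(sharp, axis=0)          # which frame wins at each pixel
    return np.asarray(focus_distances)[best]  # map frame index -> distance
```

Depth resolution here is limited to the number of focus steps captured, which is one reason the patent's "optimizing capture" angle (picking which focus distances to shoot) matters.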

The multiply-focused shots can also improve the simulation of lens bokeh. Right now, the iPhone essentially takes two shots to computationally isolate the subject from the background. But that's not what real lens bokeh looks like -- it gets progressively softer in front of and behind the subject, among other characteristics. With all that image data, the phone can more accurately compute the intensity and look of the defocus.
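That "progressively softer" behavior is the key difference from a two-shot cutout: with a per-pixel depth estimate, blur can grow with distance from the chosen focal plane in both directions. As a rough illustration (my own toy scheme, not Apple's), blend between progressively blurred copies of the image, picking a larger blur radius the farther a pixel's depth is from the focal distance:

```python
import numpy as np

def box_blur(img, radius):
    # Simple box blur via shifted sums (wraps at edges; fine for a demo).
    if radius == 0:
        return img
    k = 2 * radius + 1
    out = np.zeros_like(img)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            out += np.roll(np.roll(img, dy, 0), dx, 1)
    return out / (k * k)

def synthetic_bokeh(img, depth, focal_dist, strength=2.0):
    # Blur radius grows with |depth - focal plane|, so defocus gets
    # progressively stronger both in front of and behind the subject.
    radius = np.clip(np.abs(depth - focal_dist) * strength, 0, 3).astype(int)
    blurred = [box_blur(img, r) for r in range(4)]  # precompute blur levels
    out = np.empty_like(img)
    for r in range(4):
        mask = radius == r
        out[mask] = blurred[r][mask]
    return out
```

Real lens bokeh also has an aperture-shaped blur kernel and intensity effects this box blur ignores, which is exactly the kind of detail more image data would let the phone model better.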

In order to perform focus stacking, you have to be able to refocus the lens quickly, accurately and in controllable steps, which usually requires new hardware -- like a new camera module -- and definitely new software.