Apple envisions three sensors to improve iPhone's color photos

A newly granted patent describes a mobile device equipped with three sensors to enhance the colors in your photos.

Lance Whitney Contributing Writer
Lance Whitney is a freelance technology writer and trainer and a former IT professional. He's written for Time, CNET, PCMag, and several other publications. He's the author of two tech books, one on Windows and another on LinkedIn.

Your iPhone of the future could deliver better color photos if equipped with three separate sensors.

Granted Tuesday by the U.S. Patent and Trademark Office, an Apple patent dubbed "Image capture using luminance and chrominance sensors" envisions one sensor dedicated to luminance and two dedicated to chrominance as a way to better capture colors in your mobile device's photographs.

Luminance measures the amount of light reflected from an object, while chrominance carries an image's color information. The technology built into the iPhone would combine the luminance from the first sensor with the chrominance from the other two sensors to form a single composite image. As such, the final image would more faithfully reproduce the actual colors, resulting in a higher-quality and more accurate photo.
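To see how one luminance channel and two chrominance channels can be merged into a full-color picture, here's a minimal sketch using the standard ITU-R BT.601 YCbCr-to-RGB conversion. This is an illustration of the general idea, not Apple's patented method; the function name and array shapes are our own assumptions.

```python
import numpy as np

def combine_luma_chroma(y, cb, cr):
    """Combine one luminance plane (Y) with two chrominance planes
    (Cb, Cr) into a single RGB image, using the standard ITU-R BT.601
    conversion. Y is expected in [0, 1], Cb/Cr in [-0.5, 0.5]."""
    r = y + 1.402 * cr
    g = y - 0.344136 * cb - 0.714136 * cr
    b = y + 1.772 * cb
    # Stack the three channels and clamp to the valid RGB range.
    return np.clip(np.stack([r, g, b], axis=-1), 0.0, 1.0)

# A mid-gray luma plane with neutral chroma yields a neutral gray image.
y = np.full((2, 2), 0.5)
cb = np.zeros((2, 2))
cr = np.zeros((2, 2))
rgb = combine_luma_chroma(y, cb, cr)
```

In practice the three sensor images would first need to be geometrically aligned, since each sensor sees the scene from a slightly different position.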

Each sensor may have a blind spot where it can't detect a certain region included in the photograph. The three-sensor approach would compensate for this by ensuring that the blind regions are all offset by other sensors. So, if one sensor doesn't "see" a certain region, another sensor would pick it up.
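The fallback idea described above can be sketched as a simple masked merge: wherever one sensor is blind, substitute pixels from another sensor's (already aligned) view. This is a toy illustration under our own assumptions; real systems would need image registration and blending.

```python
import numpy as np

def fill_blind_regions(primary, secondary, blind_mask):
    """Where blind_mask is True, the primary sensor can't see the
    region, so take the corresponding pixels from the secondary
    sensor instead (alignment between the views is assumed)."""
    return np.where(blind_mask, secondary, primary)

primary = np.zeros((3, 3))            # primary sensor's image
secondary = np.ones((3, 3))           # second sensor's image
blind = np.zeros((3, 3), dtype=bool)
blind[1, 1] = True                    # one blind pixel in the primary view
merged = fill_blind_regions(primary, secondary, blind)
```

Because the patent arranges the sensors so their blind regions are mutually offset, some sensor always covers each part of the scene.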

Multi-sensor technology has clearly been on Apple's mind. A patent granted to the iPhone maker on July 23 described a way to combine two or more sensors in a mobile device to create better photos and videos.

(Via AppleInsider)