Your iPhone of the future could deliver better color photos if equipped with three separate sensors.
Granted Tuesday by the U.S. Patent and Trademark Office, an Apple patent dubbed "Image capture using luminance and chrominance sensors" envisions using one sensor for luminance and two for chrominance as a way to better capture colors in your mobile device's photographs.
Luminance measures the amount of light reflected from an object, while chrominance defines the color of an image. The technology built into the iPhone would combine the luminance from the first sensor and the chrominance from the other two sensors to form a single composite image. As such, the final image would more faithfully reproduce the actual colors, resulting in a higher quality and more accurate photo.
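To illustrate the idea, here is a minimal sketch of how a single pixel's luminance sample might be combined with two chrominance samples to produce a color value. The patent doesn't specify a conversion formula; this uses the standard BT.601 YCbCr-to-RGB math purely as a stand-in:

```python
# Hypothetical sketch: fuse one luminance (Y) sample with two chrominance
# (Cb, Cr) samples into a single RGB pixel. The constants below are the
# standard BT.601 conversion, an assumption for illustration only -- not
# anything described in Apple's patent.

def ycbcr_to_rgb(y, cb, cr):
    """Combine a luminance sample with two chrominance samples into RGB."""
    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    # Clamp each channel to the valid 8-bit range.
    clamp = lambda v: max(0, min(255, round(v)))
    return clamp(r), clamp(g), clamp(b)

# A neutral mid-gray pixel: mid luminance, chrominance centered (no color).
print(ycbcr_to_rgb(128, 128, 128))  # -> (128, 128, 128)
```

In a real three-sensor camera, this per-pixel fusion would happen only after the three sensor images were aligned to each other, since each sensor views the scene from a slightly different position.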
Each sensor may have a blind spot where it can't detect a certain region included in the photograph. The three-sensor approach would compensate for this by ensuring that the blind regions are all offset by other sensors. So, if one sensor doesn't "see" a certain region, another sensor would pick it up.
Multi-sensor technology has been on Apple's mind for a while. A patent granted to the iPhone maker on July 23 described a way to create better photos and videos.