Your iPhone of the future could deliver better color photos if equipped with three separate sensors.
Granted Tuesday by the U.S. Patent and Trademark Office, an Apple patent dubbed "Image capture using luminance and chrominance sensors" envisions pairing one sensor for luminance with two for chrominance to better capture colors in your mobile device's photographs.
Luminance measures the amount of light reflected from an object, while chrominance defines the color of an image. The technology built into the iPhone would combine the luminance from the first sensor and the chrominance from the other two sensors to form a single composite image. As such, the final image would more faithfully reproduce the actual colors, resulting in a higher quality and more accurate photo.
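The patent doesn't spell out the math, but the general idea of merging a luminance plane with two chrominance planes can be sketched with the standard YCbCr color model (here using BT.601 conversion coefficients; the function name and sensor-to-plane mapping are illustrative assumptions, not Apple's actual method):

```python
import numpy as np

def compose_rgb(y, cb, cr):
    """Hypothetical sketch: combine a luminance plane (Y) from one sensor
    with two chrominance planes (Cb, Cr) from two other sensors into a
    single RGB composite, using BT.601 YCbCr -> RGB coefficients.

    All planes are float arrays in [0, 1]; Cb/Cr are centered at 0.5.
    """
    y, cb, cr = (np.asarray(a, dtype=float) for a in (y, cb, cr))
    r = y + 1.402 * (cr - 0.5)
    g = y - 0.344136 * (cb - 0.5) - 0.714136 * (cr - 0.5)
    b = y + 1.772 * (cb - 0.5)
    # Stack the three channels into an H x W x 3 image and clamp to [0, 1].
    return np.clip(np.stack([r, g, b], axis=-1), 0.0, 1.0)

# A mid-gray pixel: chroma planes at their neutral midpoint.
pixel = compose_rgb(np.array([[0.5]]), np.array([[0.5]]), np.array([[0.5]]))
```

With both chroma planes at their neutral value, the composite collapses to a gray pixel whose brightness equals the luminance reading, which is why a dedicated, higher-fidelity luminance sensor can sharpen the whole image.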
Each sensor may have a blind spot where it can't detect a certain region included in the photograph. The three-sensor approach would compensate for this by ensuring that the blind regions are all offset by other sensors. So, if one sensor doesn't "see" a certain region, another sensor would pick it up.
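That offset-blind-spot idea can be illustrated with a toy merge, where each sensor reports a validity mask and every pixel is filled from the first sensor that actually sees it (the function and masks here are hypothetical, purely to show the compensation logic):

```python
import numpy as np

def fill_blind_spots(images, valid_masks):
    """Toy sketch of blind-spot compensation: for each pixel, take the
    value from the first sensor whose mask marks that pixel as visible.

    images: list of same-shape float arrays, one per sensor.
    valid_masks: list of same-shape boolean arrays (True = sensor sees it).
    """
    out = np.full(images[0].shape, np.nan)
    for img, mask in zip(images, valid_masks):
        # Only fill pixels this sensor sees that are still missing.
        take = mask & np.isnan(out)
        out[take] = img[take]
    return out

# Two sensors with complementary blind regions cover the full frame.
a = np.array([1.0, 1.0, np.nan, np.nan])
b = np.array([np.nan, 2.0, 2.0, 2.0])
merged = fill_blind_spots([a, b], [~np.isnan(a), ~np.isnan(b)])
```

Because the blind regions are offset rather than overlapping, no pixel is left unobserved, which is the property the three-sensor arrangement is designed to guarantee.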
Multi-sensor technology has been on Apple's mind for a while. A patent granted to the iPhone maker on July 23 described a way to create better photos and videos.