I love press releases for really geeky stuff like image sensors, especially the releases declaring amazing breakthroughs. They're fun because there's usually some really interesting development buried in them, but the people who write the releases have no idea what it is. Ditto for many of the Web sites that write about them. So you end up with some verbatim quotes so dense, an electron couldn't tunnel through them. This brings me to today's announcement from Panasonic, featuring a rugged new image sensor designed to withstand the deterioration caused by weather, heat, and ultraviolet light.
Image sensors are inherently fragile. In exceptionally bright environments, the light receptors can overflow from a surfeit of photons, and all that increased photonic activity generates heat, which raises the sensor's noise floor and decreases its efficiency. General-purpose image sensors usually require tiny microlenses, typically made from polymers, to gather the light and focus it on the individual receptors. Polymers are quite hardy, but even their properties tend to change as temperatures rise.
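To put a rough number on the heat problem: a common rule of thumb (not anything from Panasonic's release) is that a sensor's dark current, the leakage that shows up as noise, roughly doubles for every ~6°C of temperature increase. A quick sketch of that scaling:

```python
# Rule-of-thumb sketch: dark current roughly doubles every ~6 degC.
# The reference values and doubling interval here are illustrative
# assumptions, not specs for any particular sensor.

def dark_current(i_ref: float, t: float, t_ref: float = 25.0,
                 doubling_deg: float = 6.0) -> float:
    """Scale a reference dark current i_ref (measured at t_ref degC)
    to temperature t, assuming exponential doubling behavior."""
    return i_ref * 2 ** ((t - t_ref) / doubling_deg)

# A sensor with 1 e-/s of dark current at 25 degC, baking at 55 degC:
print(dark_current(1.0, 55.0))  # -> 32.0, i.e. a 32x jump in leakage
```

Thirty degrees of heating, a thirty-two-fold jump in leakage; that's why a camera stuck on a sun-baked dashboard has a much harder job than one on a tripod at dusk.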
Furthermore, the dye used for the on-chip RGB filters can fade when exposed to UV rays, just as the dyes in your inkjet prints do, rendering the filters less effective. (The press release states "color images captured by a camera used under direct sunlight, including the ultra-violet (UV) portion, and higher temperature conditions will fade faster," but that's not quite accurate. What it should say is that the camera's ability to capture intense colors decreases as it's exposed to shorter wavelengths and hotter temperatures. The images you've shot already are fine.)
Panasonic's new MOS (metal-oxide semiconductor) sensor addresses some of these durability issues by replacing the polymer microlenses with digital microlenses and replacing dye-based color filters with photonic crystal filters. In other words, rather than a single lens placed atop each photodiode, a so-called "digital" microlens is an array of inorganic materials (lacking carbon-hydrogen bonds), in which each component of the array is smaller than the wavelengths of light it's designed to focus.
(Note: This is where I start guessing, since I haven't yet downloaded the article on which the technology seems to be based, "Degradation-free MOS image sensor with photonic crystal color filter," which was published in the June 2006 issue of IEEE Electron Device Letters.)
Since these array components are smaller than the wavelength, they diffract the light; the arrays are deposited in concentric circles atop the diodes, arranged so that the light bends inward toward the diode. I envision the arrays as stadium lights, but with diffracted rather than emitted light. That image is probably wrong, but it's stuck in my head. Still, I think the type of lensing is close.
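Concentric rings that diffract light toward a focal point is, at least in spirit, a Fresnel zone plate, so here's a back-of-the-envelope sketch using the classic zone-plate radius formula. To be clear, the zone-plate geometry, the 3 µm focal distance, and the function name are my assumptions for illustration; Panasonic's actual structure may differ considerably.

```python
import math

# Sketch: radii of the first few Fresnel zones for a diffractive lens,
# using the standard zone-plate formula r_n = sqrt(n*lam*f + (n*lam/2)**2).
# The focal distance and wavelength are illustrative assumptions.

def zone_radii(wavelength_nm: float, focal_um: float, zones: int) -> list[float]:
    """Return the first `zones` zone-plate radii in micrometers."""
    lam = wavelength_nm * 1e-9   # convert nm to meters
    f = focal_um * 1e-6          # convert um to meters
    return [math.sqrt(n * lam * f + (n * lam / 2) ** 2) * 1e6  # back to um
            for n in range(1, zones + 1)]

# Green light (550 nm), focusing 3 um above a photodiode:
for n, r in enumerate(zone_radii(550, 3.0, 4), start=1):
    print(f"zone {n}: r = {r:.2f} um")
```

The first zone comes out around 1.3 µm across, which at least passes the sanity check: it's the right order of magnitude for a structure sitting on top of a few-micron photodiode.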
The other key technology, the photonic crystal color filter, is basically a structure composed of alternating regions that permit or forbid the propagation of electromagnetic waves. The range of forbidden frequencies is called the photonic band gap, and it can be tuned; in a general-purpose sensor, the filters would be tuned so that each one passes only the frequencies corresponding to red, green, or blue. But the neatest thing is that they could be tuned to any frequency, within practical and physical limits, to enhance color capture across the spectrum. And because they don't rely on a manufacturer's ability to produce a specific color dye, they should be far more stable over time.
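To make the tuning idea concrete, consider the simplest photonic crystal: a 1-D quarter-wave stack of two alternating inorganic materials, where each layer is a quarter-wavelength thick inside its material. That centers the band gap (the reflected, blocked range) on the target wavelength; a real filter adds a "defect" cavity layer to open a narrow transmission window inside the gap, but the tuning knob is the same. The refractive indices below are illustrative (roughly TiO2 and SiO2), not necessarily Panasonic's materials.

```python
# Minimal sketch: layer thicknesses for a quarter-wave stack whose
# photonic band gap is centered on a chosen wavelength. Indices are
# illustrative assumptions (roughly TiO2 / SiO2).

def quarter_wave_layers(center_nm: float, n_high: float = 2.4,
                        n_low: float = 1.46) -> tuple[float, float]:
    """Physical thicknesses (nm) of the high- and low-index layers
    that center the band gap at center_nm."""
    return center_nm / (4 * n_high), center_nm / (4 * n_low)

for color, lam in [("red", 650), ("green", 550), ("blue", 450)]:
    hi, lo = quarter_wave_layers(lam)
    print(f"{color}: high-index layer {hi:.0f} nm, low-index layer {lo:.0f} nm")
```

Note how retargeting the filter from blue to red is just a change in layer thickness during deposition; no new dye chemistry required, which is exactly the manufacturing advantage claimed above.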
So while the press release makes some vague promises for the durability of security and automotive cameras, I see sensors like these as, at least, the precursors for those needed to photograph really hot places with cool color spectra...like Venus.