SAN FRANCISCO--Panasonic showed technology on Monday that could shift the creation of high-dynamic range photos off the computer and directly into a camera's image sensor.
And it works through a variation of a familiar photographic technique called exposure bracketing. For years, photographers challenged by tough lighting conditions have taken multiple pictures of the same scene at different brightness levels--bracketing--to help ensure one photo has a good balance of shadow and highlight details.
More recently, computers have made it possible to combine these bracketed exposures into a single high-dynamic range (HDR) image that captures both bright and dark areas--for example, the subtle tones of both a bride's white wedding dress and a groom's tuxedo--that lie beyond the abilities of a camera taking a single shot.
In research shown here at the International Solid-State Circuits Conference, Takayoshi Yamada of Matsushita Electric Industrial--better known as Panasonic to most people--showed technology that he said lets an image sensor capture that high-dynamic range information.
With today's sensors, "You can get either highlight or shadow detail, depending on the exposure time. To get much wider dynamic range images, we need to combine these different-exposure images," Yamada said.
Yamada showed a 177x144 pixel image sensor that takes three photos of the same scene in rapid succession. In one example, he said, the first exposure lasts 1.5 microseconds, the second 150 microseconds, and the third 15,000 microseconds (not far from a 1/60 second exposure). Extra circuitry built into the sensor records the data from the multiple exposures and uses an assortment of electronic capacitors to combine it into a single image that spans a greater dynamic range.
Working at a frame rate of 15 frames per second, the sensor's images can span a dynamic range of 140 decibels, compared with the 60dB range of ordinary sensors, the researchers said.
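To put those decibel figures in perspective (our arithmetic, not a calculation from the paper): dynamic range in dB is 20 times the base-10 logarithm of the ratio between the brightest and dimmest levels a sensor can resolve, so 60dB corresponds to a 1,000:1 ratio and 140dB to 10,000,000:1.

```python
import math

def dynamic_range_db(brightest, dimmest):
    """Dynamic range in decibels: 20 * log10 of the intensity ratio."""
    return 20 * math.log10(brightest / dimmest)

conventional = dynamic_range_db(1_000, 1)       # 1,000:1 ratio -> 60.0 dB
panasonic = dynamic_range_db(10_000_000, 1)     # 10,000,000:1 ratio -> 140.0 dB
```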
In his presentation, Yamada showed a resulting image taken of a regular incandescent light bulb. With conventional sensor technology, a few of the words printed on the bulb were visible, but most were washed out in a blown-out white patch near the bulb's filament. In the Panasonic sensor's image, not only were most of the words visible, but so was the helical coil of the filament.
Combining multiple exposures has been possible before, but only using technology that recorded the multiple exposures in separate areas called frame memories, Yamada and his Panasonic colleagues said in a paper on the subject.
Despite efforts such as Fujifilm's SuperCCD sensors, camera buffs are often frustrated by image sensor dynamic range that's significantly weaker than what the human eye can detect. Although the Panasonic research shows some promise, photo nerds should rein in their hopes: the research has shown only black-and-white images so far and is suited "for automotive and security cameras," according to the researchers' paper.