
InVisage aims to remake camera sensor market

A Silicon Valley start-up believes its image sensor technology will dramatically improve smartphone cameras by gathering light more efficiently.

Stephen Shankland

People are flocking to a new generation of smartphones with rich applications, high-powered Web browsers, and large touch screens. What those products lack, though, is a camera that's equally transformative.

A start-up called InVisage expects to change that for consumers next year with a new approach to digital camera image sensors. Its technology, called QuantumFilm, is four times more efficient at capturing light than traditional silicon-based image sensor chips, meaning the company's sensors will offer either higher sensitivity in low light or more megapixels in resolution.

A prototype of InVisage's image sensor chip. (InVisage)

"With a tiny smartphone 3-megapixel sensor, we could make that a 12-megapixel sensor," said Chief Executive Jess Lee. "Or we could quadruple its sensitivity and ISO. That's the net benefit here." Higher sensitivity means photos that aren't as afflicted with the flecks of color that mean the sensor is capturing noise instead of what a person wants to photograph.

The Menlo Park, Calif.-based company is set to demonstrate its products at the Demo conference in Palm Springs, Calif., on Monday, coming out of stealth mode in the process. Specifically, it'll show images produced by a sensor whose pixels measure only 1.1 microns, or millionths of a meter, on edge.

Essentially, the technology works by adding a new, finely tuned light-sensitive layer on top of the silicon chip, Lee said. That layer converts incoming light into electrical signals more efficiently, and the light isn't partially blocked by the chip's metal wiring layers, either.

Companies that make camera sensors, including Panasonic, Sony, Canon, Micron Technology spinoff Aptina Imaging, and OmniVision Technologies, have been working to snatch as many as possible of the photons that come through the camera lens. Among other things, they've reduced the size of the circuitry that gets in the way of capturing light, increasing each pixel's "fill factor"; they've flipped the sensor design around so the circuitry doesn't sit in front of the light-gathering silicon, an approach called back-side illumination; and they've devised "gapless" microlenses that gather light from one edge of the pixel to the other and focus it on the light-sensitive area.

And those sensor makers have made steady progress. In particular, SLR cameras can now shoot at ISO sensitivity settings as high as 102,400 in a couple of cases. But SLRs use large, expensive sensors that don't fit within a mobile phone camera's physical housing or price constraints, and smaller sensors require some combination of fewer megapixels and smaller, less sensitive pixels.

InVisage believes its approach offers a far bigger leap than the incremental improvements the industry has managed so far, and though it's aiming initially at high-end mobile phones, the technology will work in ordinary digital cameras, security cameras, and military night-vision systems as well, Lee said.

The company has ambitions to remake the image sensor market, but doing so isn't easy. Foveon, another Silicon Valley image sensor start-up, has had only niche success, for example. And it's going up against major chipmaking experts with established businesses.

Competitor OmniVision has reached 1.1-micron pixels, too, through its own partnership with Taiwan Semiconductor Manufacturing Co. (TSMC), and it has said the process will work with sub-micron pixels as well.

InVisage has backing in the form of more than $30 million raised from RockPort Capital, Charles River Ventures, InterWest Partners, and OnPoint Technologies. It's got 30 employees to date and a manufacturing partnership with TSMC, as well.

And Lee argues InVisage has an advantage over the incumbent powers: its technology doesn't require such advanced manufacturing equipment to make. OmniVision's 1.1-micron pixel sensor requires equipment that can make features as small as 65 nanometers, or billionths of a meter, but InVisage's requires only 110-nanometer equipment, Lee said.

InVisage's executives include Lee, who previously was a vice president at OmniVision and also worked at Altera, Silicon Graphics, and Creative Labs; nanotechnology researcher and Chief Technology Officer Ted Sargent; and Marketing Director Michael Hepp, who worked at OmniVision in product marketing and program management and earlier at National Semiconductor.

InVisage is starting with smartphones first because it's an established, high-volume market. "We're working with two top-tier handset manufacturers already," Lee said, declining to mention them by name.

The company will begin producing samples of its chips by the end of the year. With mass production typically taking six to nine months after that, people could start seeing them in products by mid-2011, Lee said.

Making it work

Part of the company's sales pitch is that its technology integrates with mainstream chip manufacturing at chip foundries such as TSMC. InVisage makes the QuantumFilm material itself, hands it off to a chemical company that makes a liquid suspension out of it, and sends cartridges to the chip foundry. There, it's layered onto silicon wafers using the conventional spinning process that produces a thin film of the material across the top of the wafer.

The quantum-dot layer is sandwiched between conventional image sensor components. On one side is the circuitry for reading data off the sensor, and on the other is the color filter array that ensures each pixel receives only red, green, or blue light.

What is a quantum dot, exactly? "It's a semiconductor particle made to be precisely [a few] nanometers in size. By controlling its size, you can change its core semiconductor property, called band gap," which is the specific amount of energy it takes to get an electron into a higher-energy state. QuantumFilm's materials are tuned specifically to be sensitive to the energy of visible-light photons.
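
To put that in concrete terms, a photon's energy depends on its wavelength (E = hc/λ), so tuning a dot's band gap to visible light means targeting photon energies of very roughly 1.6 to 3.1 electron volts. The short sketch below is standard physics arithmetic, not InVisage data:

    # Photon energy E = h*c / wavelength, expressed in electron volts (standard constants).
    H = 6.626e-34   # Planck constant, joule-seconds
    C = 2.998e8     # speed of light, meters per second
    EV = 1.602e-19  # joules per electron volt

    def photon_energy_ev(wavelength_nm):
        return H * C / (wavelength_nm * 1e-9) / EV

    for color, nm in [("blue", 450), ("green", 550), ("red", 650), ("near-infrared", 950)]:
        print(f"{color:14s} {nm:4d} nm -> {photon_energy_ev(nm):.2f} eV")
    # To absorb a photon, a dot's band gap must be no larger than the photon's energy,
    # so covering the whole visible range means a gap at or below roughly 1.9 eV (red);
    # smaller gaps extend the response into the infrared.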

Conventional CMOS image sensors use the silicon semiconductor layer at the bottom of a chip to gather light, somewhat inefficiently. InVisage believes its QuantumFilm approach, which gathers light at the top layer with precisely made quantum dots in a thin film, is superior. (InVisage)

Silicon is sensitive to light, but it responds most strongly to infrared and red, less well to green, and worse still to blue. QuantumFilm is much more sensitive across all visible colors, and it isn't hidden behind two layers of metal within the chip, Lee said.

"We're at about four times the overall raw sensitivity (of) traditional CMOS pixel," Lee said, referring to the complementary metal oxide semiconductor technology used in mainstream computer chips and many image sensors.

The company is deliberately trying to integrate with the existing image sensor industry to make it easier for phone makers to buy the chips. They can obtain them from an established fab, TSMC, and the red-green-blue data the sensor produces is the same as from conventional chips.

Adapting QuantumFilm to ordinary chipmaking was tough. "We have to conform ourselves to what's conventional out there. Tweaking the film to match this conventional process was one of the biggest challenges," Lee said.

Future directions

InVisage has plenty of ambitions for future products.

One avenue is tackling other markets besides smartphones.

"We're still weighing our options there. The smartphone effort will take a little effort to get through," Lee said. Compact cameras is one fit where more sensitivity or resolution would be welcome, but InVisage's technology has another potential benefit for other markets. Because it's sensitive to many different frequencies of light, it could be used in security and military applications where it could be beneficial to show an infrared view sometimes and visible light at others.

"Multispectral imaging is an exotic area people have been studying," but past approaches have required unusual combinations of materials such as both silicon and gallium arsenide chips, and needed significant cooling to work. This higher-end work, though a lower-volume market, "is a great opportunity for us," Lee said. OnPoint Ventures, he added, is associated with military technology work, so there close attention paid to this area.

In another exotic application, the quantum-dot technology could be used to capture infrared and visible light simultaneously, an approach of interest for 3D cameras that gauge the varying distances to the subjects they photograph by measuring the time it takes infrared light to travel from the camera to the subject and back. That requires distance data for each pixel in the final image, and InVisage's technology could pair infrared pixels with visible-light pixels.
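
The underlying distance measurement is simple in principle: an infrared pulse travels at the speed of light, so half the round-trip time multiplied by that speed gives the range. A minimal sketch of that generic time-of-flight arithmetic (not a description of InVisage's implementation):

    # Generic time-of-flight arithmetic: distance = speed of light * round-trip time / 2.
    C = 2.998e8  # speed of light, meters per second

    def distance_m(round_trip_seconds):
        return C * round_trip_seconds / 2

    # A pulse that returns after about 10 nanoseconds indicates a subject roughly 1.5 m away.
    print(distance_m(10e-9))  # ~1.5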

The company also has plans to move beyond today's era of separate pixels for detecting red, blue, and green light. That approach, called the Bayer pattern, divides the grid of pixels into a checkerboard pattern, and through a process called demosaicing, the camera has to make its best guess to fill in the missing data so each pixel has values for red, green, and blue.
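
To make the idea of demosaicing concrete, here's a deliberately crude sketch of the kind of interpolation involved, assuming an RGGB Bayer layout: each pixel records one color sample, and the two missing channels are estimated by averaging nearby samples of the right color. Real camera pipelines use far more sophisticated algorithms than this.

    import numpy as np
    from scipy.signal import convolve2d

    # Crude demosaicing sketch for an RGGB Bayer mosaic: the two missing color
    # channels at each pixel are filled in by averaging nearby samples of that color.
    def demosaic_naive(mosaic):
        h, w = mosaic.shape
        r_mask = np.zeros((h, w), dtype=bool); r_mask[0::2, 0::2] = True
        b_mask = np.zeros((h, w), dtype=bool); b_mask[1::2, 1::2] = True
        g_mask = ~(r_mask | b_mask)
        kernel = np.ones((3, 3))
        rgb = np.zeros((h, w, 3))
        for channel, mask in enumerate((r_mask, g_mask, b_mask)):
            samples = np.where(mask, mosaic, 0.0)
            summed = convolve2d(samples, kernel, mode="same")
            counts = convolve2d(mask.astype(float), kernel, mode="same")
            rgb[:, :, channel] = summed / np.maximum(counts, 1.0)  # average of known neighbors
        return rgb

    raw = np.arange(16, dtype=float).reshape(4, 4)  # toy 4x4 grid of raw sensor readings
    print(demosaic_naive(raw).shape)                # (4, 4, 3) -- full color at every pixel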

InVisage believes it can stack three layers of QuantumFilm--blue on top, then green, then red at the bottom--to capture all three values of light at each pixel location. Another possibility is to pattern its quantum dots specifically into a color array. Either approach gets rid of the color filter array atop today's image sensors, letting more light through to the sensor.

Another idea the company has is to get rid of the infrared filter that today's image sensors typically employ to screen out unwanted light frequencies. InVisage needs a more tightly controlled quantum dot manufacturing process before that sharp cutoff can be assured, though, so for now its image sensors will require the filters.

But first, the company plans to prove itself with its narrower smartphone agenda. "We like to go step by step," Lee said.