

Celebrating 60 years of transistors

The modern world wouldn't be what it is if it weren't for a little piece of technology that emerged from Bell Labs in 1947.

On December 16, 1947, John Bardeen and Walter Brattain, two Bell Labs researchers, built the world's first transistor.

Their device, called a point-contact transistor, conducted electricity and amplified signals, a job then handled by bulky and delicate vacuum tubes and other components.

Their colleague William Shockley followed soon after with junction transistors. Although Bardeen and Brattain were first, Shockley's device became the basis for a scientific and industrial juggernaut.

"It is the seminal device in terms of the way we think about information, and information is everything, from the music we listen to (to) the TV we watch," Intel CTO Justin Rattner said. "Modern communications is all based on theories of information, not on how many megawatts we can pump into the antenna. It is how clever we can be finding those few faint signals and putting them to use, which is a computing problem."

He added: "You couldn't have five tubes in your iPod."

Besides making it easier to store information and send signals, transistors had another, somewhat unanticipated, characteristic. They could be shrunk at a consistent rate over time, which has made transistors and electronic products steadily cheaper and faster.

The effect, ultimately expressed as Moore's Law, encouraged investors to pour money into high-tech outfits because people had at least some level of assurance that tomorrow's products would be noticeably better than the ones available today. A high tolerance for risk has become one of the defining traits of Silicon Valley.
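The doubling dynamic behind Moore's Law is easy to see with a little arithmetic. The sketch below projects a transistor count forward under an assumed two-year doubling period; the 1971 starting figure (roughly 2,300 transistors on Intel's 4004) is a commonly cited historical number used here for illustration, not data from this article.

```python
def transistors(start_count, start_year, year, doubling_period=2.0):
    """Project a transistor count forward, assuming counts double
    every `doubling_period` years (the classic Moore's Law cadence)."""
    return start_count * 2 ** ((year - start_year) / doubling_period)

# Starting from ~2,300 transistors in 1971 and projecting to 2007,
# the transistor's 60th anniversary:
count_2007 = transistors(2300, 1971, 2007)
print(f"{count_2007:,.0f}")  # on the order of hundreds of millions
```

Thirty-six years at a two-year doubling period is 18 doublings, which is why a four-digit count in 1971 lands in the hundreds of millions by 2007.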

For tech companies, Moore's Law also served as a threat. Companies that chose not to invest in the new manufacturing techniques or components would quickly fall behind. Thus, innovation has become a matter of simple survival.

Predicting the end of Moore's Law is a cottage industry. If it does end, the industry's heady pace could slow. Consumers would simply stop replacing their computers or other devices as fast as they do now and resort to getting new stuff when it breaks, Dan Hutcheson, CEO of VLSI Research, has said.

To date, though, the naysayers have been wrong. Lithography, the technique used to draw circuits on chips, was supposed to hit a wall at 1 micron, and then at 250-nanometer manufacturing. That's because, some people theorized, it would be impossible to draw circuits smaller than the wavelength of light used by lithography machines. The industry blew past the 250-nanometer mark in the mid-1990s. (A micron is a millionth of a meter, and a nanometer is a billionth of a meter. The measurement refers to the average feature length of a chip.)

Chips now come out of factories with 45-nanometer features, thanks to the introduction of metal gates in transistors, a massive change.

Many believe that Moore's Law, as it applies to existing technologies, may peter out around 2020. The structures inside transistors--particularly an insulating layer called the gate oxide--will by that time consist of only a few atoms.

Nonetheless, optimists say chip designers will stop shrinking transistors and instead begin to stack them, so the economic and performance benefits would continue to grow. Intel and IBM are also working on transistors with two or three gates, which would have a similar effect to going 3D.

Others believe chip designers will find a way to harness quantum effects--that is, replacing electronic signals with another physical phenomenon.

"We just can't turn the crank. If anything, it is becoming more difficult, and we will see many more dramatic changes than we've seen in the last 40 years," said Rattner. "Will we call it Moore's Law when the transistors don't use electrical charge? At some point we will make a transition from charge-based transistors to something else. As long as we preserve the basic tenets of Moore's Law, I think people will still call it that."