
The New World Economy and Social Club

The leaders of the semiconductor industry gather to chart the direction of the New Economy, predict who will be the next global power, and, well, down a few beers and eat chocolate cake.

SAN JOSE, Calif.--The leaders of the semiconductor industry gathered this week to chart the direction of the New Economy, predict who will be the next global power, and, well, down a few beers and eat chocolate cake.

The Semiconductor Industry Association held its forecast and awards dinner Wednesday night at the Fairmont Hotel in San Jose, Calif. The event primarily exists as a vehicle for delivering the trade group's forecast for demand across the different segments of the semiconductor industry.

Naturally, the future looks bright to the SIA. By 2003, chips will constitute a $319.3 billion business, according to the trade group. In 2000, 122 chip acquisitions, valued at over $50 billion, took place. More will follow, according to a report presented by Mark Edelstone of Morgan Stanley Dean Witter.

The increasing importance of semiconductors will even modulate the ups and downs of global economics, predicted Wilfred Corrigan, CEO of LSI Logic. Inflationary peaks won't be as high as they once were because semiconductors--the basis of the New Economy--become less expensive over time.

"What we're going to see in the next 10 years is that Europe and Japan will be more affected by oil prices than this economy," he said.

Not that semiconductors aren't critical to modern life. "The (invention of) the semiconductor is the most significant event since man emerged as a life form," Corrigan boomed.

Jerry Sanders, CEO of Advanced Micro Devices, said the public looks at microchips as "near-magical objects possessing nearly supernatural powers...Even in our jaded age, the chip still commands awe."

But just as important, the SIA dinner demonstrated that world domination conspiracies aren't nearly as spooky as they once were. Here were the gathered captains of the New Economy: 500 lumpy, middle-aged executives in full-throttle, back-slapping mode.

Business associates greeted each other and wisecracked about stock analysts, who have replaced reporters as the butt of jokes.

What? No black turtlenecks?
The Alabama Semiconductor Alliance stood around its table, uncomfortably waiting for the moment when it would be OK to sit down. Other than the lady in the red outfit, and the joker with the plaid pants, nearly everyone else wore a blue suit.

How meat and potatoes is this crowd? The salad course was ham.

The scene contrasted sharply with the image of the digital economy held by the population at large. According to public myth, sleek visionaries engrossed with fuzzy logic, linguistic theory and questions of social policy have forged the Internet.

In reality, it's being slapped together by a bunch of guys with comb-overs looking for the bartender's tip jar. Video montages of the pyramids, Roman ruins and Stonehenge played while attendees ate. Their day has gone, the message seemed to be. Civilization now belongs to the silicon version of the Elks Club. As far as aristocracies go, it's not that threatening.

Guts over geekdom
It's not even a class limited to those with exceptional math scores. Although Silicon Valley couldn't exist without well-educated engineers, opportunism--rather than raw intelligence--has always been the hallmark of the industry.

Silicon Valley, after all, owes its history to people who quit their jobs. Fairchild Semiconductor paid its engineers poorly and imposed a top-down, eastern seaboard management style. Those who left to start their own companies included National Semiconductor founder Charlie Sporck, AMD's Sanders, LSI's Corrigan and Robert Noyce, who started Intel. Former Fairchild employment remains a status symbol and is mentioned whenever possible.

Intel's entry into microprocessors was similarly guided by pluck. Calculator maker Busicom contracted with the company to build a programmable chip for a calculator. The result was the 4004, the world's first microprocessor. Federico Faggin, Marcian "Ted" Hoff and Stanley Mazor were given Noyce Achievement awards Wednesday for the 1971 invention.

It contained 2,300 transistors and ran at 750 kilohertz, Faggin noted. If made today, "it could comfortably fit under a bonding pad." Laughter erupted, even among those of us not smart enough to get the joke.

Busicom had foresight in seeing a need for a programmable chip, noted Sanders, but not in understanding how contracts work. "They gave back the rights" to Intel, he noted, inadvertently changing industrial history.

The chip industry's optimism, of course, is tough to counter. Since 1959, the industry has experienced only six years when sales declined from the previous year.

In the 1980s, Japanese manufacturers surpassed U.S. providers in the marketplace. At the time, no U.S. industry had ever managed to regain the upper hand. Silicon Valley was destined to become a "techno colony" of Japan, Intel chairman Andy Grove warned at an SIA dinner years ago. By the 1990s, the situation had reversed. To top it off, Texas Instruments' Jack Kilby won a Nobel Prize.

"We are making a tremendous contribution to society," Sanders said. "We are also making a tremendous amount of money."