The National Science Foundation selected the winning bid from five contenders, said Bob Borchers, who oversees the NSF's Computer and Information Science and Engineering group. The computer, to be built in spring 2001, will be available for use by academic researchers studying subjects such as biophysics, global climate change, astrophysics and materials science.
The system will be capable of a peak speed of 6 trillion mathematical calculations per second and a sustained speed of around 1 trillion calculations, ranking it among the most powerful supercomputers in the world.
Compaq has been gunning for the high-performance technical computing market, which includes not only mammoth supercomputers but also more modest systems able to crunch numbers quickly for scientists and other customers, such as banks scrutinizing global economic patterns.
While SGI has long been a power in this market with computers such as Los Alamos National Laboratory's Blue Mountain supercomputer, the company's financial difficulties mean Compaq is more worried about IBM's growing prominence, said Jesse Lipcon, vice president of Compaq's Alpha technology group. IBM yesterday announced it won an $18 million bid to build a supercomputer for the U.S. Navy.
One upcoming contract many are watching is the LANL project for a new nuclear weapons simulation computer expected to perform 30 trillion calculations per second. Lipcon confirmed today that Compaq is a bidder. SGI and Sun Microsystems also are bidding, representatives of the companies have said, and the announcement of the winning bid is expected soon.
IBM, which has a strong relationship with rival Lawrence Livermore National Laboratory and its ASCI White machine, isn't bidding for the LANL project. But Big Blue is likely to take a crack at its successor, expected to perform 100 trillion calculations per second.
The new Compaq machine is a sort of souped-up version of a popular and inexpensive supercomputer technique called Beowulf, which links dozens or hundreds of computers connected by a network. A programming task is split into numerous independent pieces that are parceled out to the individual nodes.
Though the technique doesn't work for all supercomputing problems, it's growing in popularity, and programmers are getting used to the architecture.
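The idea behind the Beowulf approach can be illustrated with a small sketch. Here Python's standard `multiprocessing` pool stands in for the networked nodes, and `simulate_chunk` is a hypothetical stand-in for one independent piece of a scientific workload; a real cluster would instead distribute the pieces over a network with a message-passing library.

```python
from multiprocessing import Pool

def simulate_chunk(bounds):
    # Stand-in for one independent piece of a larger computation,
    # e.g. one slice of a physical domain. Here: a partial sum.
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    # Split one big task into independent pieces, one per "node".
    n, nodes = 1_000_000, 4
    step = n // nodes
    chunks = [(i * step, (i + 1) * step) for i in range(nodes)]

    # Parcel the pieces out to the workers, then combine the
    # partial results -- the essence of the Beowulf technique.
    with Pool(nodes) as pool:
        partials = pool.map(simulate_chunk, chunks)
    total = sum(partials)
    print(total == sum(i * i for i in range(n)))
```

The technique works well precisely when the pieces are independent, as above; problems whose pieces must constantly exchange data are the ones it handles poorly.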
The new Compaq supercomputer will consist of 682 Compaq servers, each with four EV68 processors running at 1.1 GHz, Lipcon said. The chips will be an upcoming design with 0.18-micron features and copper technology, he said, and the four-way servers will be an as-yet unannounced design.
Each of these 682 nodes will run Compaq's Tru64 Unix, which can share a single file system across the cluster. The nodes will pass messages to one another through a massive switch built by Quadrics Supercomputers World. Although each node can talk directly to any other node through a single switch, Compaq elected to use two switches to increase communication speed.
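The quoted 6-trillion-calculation peak is consistent with the hardware figures above, if one assumes each Alpha core can complete two floating-point operations per clock cycle (an assumption about the chip, not a figure from the article):

```python
# Rough check of the quoted peak speed from the stated configuration.
nodes = 682          # four-way servers
cpus_per_node = 4    # EV68 processors per server
clock_hz = 1.1e9     # 1.1 GHz
flops_per_cycle = 2  # assumed: two floating-point ops per cycle per core

peak = nodes * cpus_per_node * clock_hz * flops_per_cycle
print(f"{peak / 1e12:.1f} trillion calculations per second")
```

Sustained speed is another matter: the article's figure of around 1 trillion reflects that real workloads rarely keep every arithmetic unit busy every cycle.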
Compaq's push into the supercomputing market is relatively young. It has won contracts with France's atomic energy commission, animation firm Blue Sky Studios, and three labs involved in the human genome project: Celera, Whitehead and the Sanger Institute.
Compaq has sold about a dozen of its supercomputers so far, Lipcon said. "Expect to see us capturing significant market share in the high end," he said.
Borchers said the NSF is paying $36 million for the machine and $3 million a year from 2001 to 2003 to operate it. In addition, the NSF hopes Congress will fund a second $36 million machine next year and a $55 million upgrade for each of the machines the year after that, Borchers said.