NSF awards $53 million supercomputing bid

The National Science Foundation awards contracts to build a grid that connects supercomputer clusters across the country.

Stephen Shankland
The National Science Foundation has awarded contracts worth $53 million to build a grid that connects supercomputer clusters across the country into a single large computing resource called the Distributed Terascale Facility.

The main part of the work will be handled by the National Center for Supercomputing Applications (NCSA) and the San Diego Supercomputer Center, said NCSA Director Dan Reed.

But a big winner will be IBM, which will build four Linux supercomputer clusters and take home tens of millions of dollars, said Mike Nelson, director of Internet technology and strategy at IBM. The NCSA's cluster will be able to perform 6.1 trillion calculations per second (teraflops), and SDSC's will handle 4 teraflops, Nelson said. Argonne National Laboratory will have a 1 teraflop machine and the California Institute of Technology a 0.4 teraflop machine.

The supercomputers will be built from Intel's "McKinley" CPU, the second-generation model of the Itanium line, the National Science Foundation said in a statement. In addition, Qwest will link the computers with a high-speed network that can transfer data at 40 gigabits per second.

IBM has embarked on a project to improve grid computing--which federates high-powered computers to give researchers access to supercomputing resources--and to speed up access to large databases. Big Blue believes the technology, chiefly appealing to academics at present, will become useful to corporations as well.

"We're pulling the pieces together. We'll be providing a lot of hardware that uses the McKinley chip," Nelson said. "We think grid computing could be just as big as Linux."

The system, which Reed and SDSC Director Fran Berman called the Teragrid, will be used for work involving national and international collaborations, Reed said. Computing jobs will include work in the areas of astronomy, cosmology, earthquake simulation, genetics, protein research, drug design, brain research and high-energy physics, Berman and Reed said.

A national board will decide how the computing power is allocated, but using it should be made simpler through the choice of open-source grid software from an organization called the Globus Project, Reed said.

Scientists won't have to worry about where exactly data is stored or what computers are churning through their calculations. "It's trying to take a distributed cluster and data architecture and build an easy-to-use interface on top of it," Reed said.

Ultimately, the grid will grow to include smaller research networks, link to other grids overseas and even incorporate countless sensors across the world, Reed said.