
Study: Supercomputer clusters shortchange security

Group says popular technique threatens U.S. security by sidelining other approaches more suited to decryption and the like.

Stephen Shankland
The popular "clustering" approach to supercomputing is indeed useful, but U.S. researchers need to explore different directions in the field to ensure the country's security, an academic panel has concluded.

Large clusters of conventional servers, machines that most often use mainstream Intel processors and the Linux operating system, are sweeping the industry and now account for 296 of the 500 fastest supercomputers, according to a list released Monday.

But the United States needs to underwrite research into new hardware and software to solve problems such as decrypting codes that clusters can't handle, said a group of researchers who are planning to unveil a study at a supercomputing conference in Pittsburgh on Friday.


In the past, supercomputer customers were limited by money rather than technology, but the pipeline of new ideas needs to be refilled, said Susan Graham, co-chair of the study and a computer science professor at the University of California, Berkeley. "Our concern is that unless we worry about it now, there's going to come a point in the future...where the capability isn't there because we've let our national expertise atrophy."

The two-year, 222-page study, "Getting Up to Speed: The Future of Supercomputing," will be presented at the SC2004 supercomputing conference. The National Research Council, part of the National Academy of Sciences, performed the work with funding from the Energy Department.

The report calls for the government, including Congress, to take a more active role in the development of supercomputing. The National Science Foundation should spend $140 million per year on a variety of small and large programs, while overall government spending should ensure top agencies can meet their total supercomputing need of about $800 million per year, the report says.

And federal supercomputer customers should take cooperative responsibility for outlining their needs. "Everybody is a customer, but nobody thinks it's his or her agency's mission to worry about the supply," Graham said. "By getting together jointly to figure out requirements for the future, they can jointly make sure that some agency within the government is funding the research."

Government subsidies have to be handled carefully, though, especially when cultivating work the mainstream market isn't interested in, warned Dave Turek, vice president of deep computing at IBM.

"Engendering investments in areas not synchronized with what the market wants...runs the risk of making the industry less competitive over time by stealing resources" that could have been put to more fruitful use elsewhere, Turek said. "The delicate issue is how far do you go before you go down the path of propping up uncompetitive companies. I think that the marketplace is a great place to shake out competing ideas to see what makes sense."

Big Blue is involved in one government-funded supercomputing project run by the Defense Advanced Research Projects Agency, which is funding IBM, Cray and Sun Microsystems to work on advanced supercomputer designs. Graham praised it but said it's only a one-time program and doesn't support research necessary for a successor.

The report praises clusters but says they're not sufficient for all tasks.

"The advances in mainstream computing caused by improved processor performance have enabled some former supercomputing needs to be addressed by clusters of commodity processors," the report says. "Yet important applications, some vital to our nation's security, require technology that is only available in the most advanced custom-built systems."

Study co-chair Marc Snir, head of the computer science department at the University of Illinois at Urbana-Champaign, pointed to decryption as one onerous task. "Clusters are good for problems that can be decomposed so you can work on chunks of the program reasonably independently without too much communication between the nodes. If the encryption can be decomposed, then the encryption isn't good, because it didn't scramble things well," he said.
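To make that distinction concrete, here is a minimal, purely illustrative sketch of the kind of decomposable workload Snir describes: each worker scans its own slice of a toy key space, so the only communication is handing out slices and collecting a result. The toy key, the slice sizes and the search_slice helper are invented for illustration and are not taken from the report.

```python
# Illustrative sketch of a decomposable problem: independent chunks, almost no
# communication between workers -- the pattern that suits commodity clusters.
from multiprocessing import Pool

SECRET = 48_611  # toy "key" the workers are searching for (invented for this example)

def search_slice(bounds):
    """Scan one independent chunk of the key space; no talking to other workers."""
    start, end = bounds
    for key in range(start, end):
        if key == SECRET:          # stand-in for an expensive trial decryption
            return key
    return None

if __name__ == "__main__":
    chunk = 10_000
    slices = [(i, i + chunk) for i in range(0, 100_000, chunk)]
    with Pool(processes=4) as pool:
        hits = [k for k in pool.map(search_slice, slices) if k is not None]
    print("found:", hits)          # -> found: [48611]
```

A well-designed cipher, by Snir's argument, is exactly the case where this kind of clean slicing is not available, which is why such problems fall to custom machines rather than clusters.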

One technology the government needs, vector supercomputing, is in danger of extinction, the report says. Vector processors, such as those used in NEC's powerful Earth Simulator supercomputer, can communicate with memory very quickly and excel at some widely used mathematical operations. But it appears they "are not viable products without significant government support," the report says. Even though there is no broad market, "The U.S. industrial base must include suppliers on whom the government can rely to build custom systems to solve problems arising from the government's unique requirements," the study says.
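As a rough illustration of the whole-array arithmetic vector machines are built for, the sketch below performs the classic a*x + y update in a single vectorized statement. NumPy is used here only as a stand-in for vector hardware; the array size and values are arbitrary and not drawn from the report.

```python
# Illustrative sketch of vector-style arithmetic: one statement updates every
# element of y, streaming x and y from memory -- the bandwidth-heavy pattern
# that vector processors such as the Earth Simulator's excel at.
import numpy as np

n = 1_000_000
a = 2.5
x = np.random.rand(n)
y = np.random.rand(n)

y = a * x + y   # vectorized update of all n elements at once

# Equivalent scalar loop, shown for contrast (one element at a time):
# for i in range(n):
#     y[i] = a * x[i] + y[i]
```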

Here, though, mainstream business technology could be relevant. IBM is adapting its conventional processors so they can be yoked together into a virtual vector processor.

Snir believes clusters may even have set supercomputing back in some areas. "Because clusters coming from Dell or HP or IBM are so good, the market for custom machines has shrunk," Snir said.

But clusters are growing more advanced, said Don Becker, chief technology officer of Penguin Computing and a pioneer of the "Beowulf" idea of Linux clusters. "I think only a tiny number of problems won't be handled by clusters five years from now," he predicted.