IBM, government talk big iron

A variety of high-level government agencies are huddling to map out the future of U.S. supercomputing and to hammer out their differences on the subject.

Michael Kanellos Staff Writer, CNET News.com
IBM and a variety of high-level government agencies are huddling to come up with an agenda for the future of U.S. supercomputers and to hammer out their differences on the subject.

Representatives from the Department of Homeland Security, Lawrence Livermore National Laboratory, the National Science Foundation, the Department of Energy and other federal agencies met with IBM executives in Washington this week to discuss pressing issues in high-performance computing, according to IBM. They also discussed how to improve the performance of certain supercomputing applications, such as those for environmental modeling.

Armonk, N.Y.-based IBM is one of the global leaders in high-performance computing, accounting for 39 percent of the processing power on the Top500 list of the world's supercomputers.

The federal government is one of the largest consumers of supercomputers, employing them for conducting astrophysics research, improving cryptography techniques and monitoring the stockpile of nuclear weapons, for example.

Still, despite the familiarity, the two groups don't always see eye to eye. In general, most private companies want to build systems out of technology that can eventually find its way into a broad spectrum of commercial products and not necessarily machines tailored to specific research demands.

"We're interested in mainstreaming everything," said Nick Donofrio, senior vice president of technology and manufacturing at IBM. "We're only going to do things that are commercially viable."

By contrast, government officials have said that high-end computers built from off-the-shelf components could start to compromise performance--a situation that may force the government to begin building, or funding the development of, systems designed specifically for its needs.

High-end systems built from common components could take longer to develop, according to David Nelson, director of the National Coordinating Office for Information Technology Research and Development (NITRD), which coordinates high-performance computing efforts for agencies such as the Homeland Security and Energy departments. Such systems could also suffer from limited memory and input/output performance, Nelson noted in a slide presentation at last week's government-IBM meeting.

"Some important applications/algorithms are not amenable to COTS-based (common off-the-shelf) HEC (high-end computers)," read one of the slides in Nelson's presentation. "Federal funding of highest-performing HEC, including development of new systems, may be required."

Examples of research that could be compromised by computers with off-the-shelf parts include protein folding, reusable-launch vehicle design and ocean-state analysis.

A federal task force is set to release a report in August detailing a five-year plan for federal supercomputing development and purchasing, according to Nelson's presentation.

The NITRD did not return calls for comment.

Other sources have said that NEC's Earth Simulator has raised concerns among some lawmakers in Washington about the United States slipping behind in supercomputing, as the Japanese supercomputer is far more powerful than any other machine. One of Nelson's slides, in fact, graphically depicts the gap between the Earth Simulator and the best U.S.-based supercomputers.

The significance of the Earth Simulator's lead, though, can be overstated. "National leadership shouldn't be defined in a moment in time," said Kathleen Kingscott, director of public affairs at IBM, who added that machines built of technology that can be broadly deployed have historically proved superior to specially built systems.

IBM is working on two supercomputers--ASCI Purple and Blue Gene/L for the Lawrence Livermore National Laboratory--that will be, respectively, four and ten times more powerful than the Earth Simulator. ASCI Purple is scheduled to become active in 2004.

Donofrio said that IBM and the various agencies are talking to try to figure out the basic needs and direction for supercomputing. In some instances, he added, performance issues arise because the software wasn't originally tailored to run on systems made with off-the-shelf components.

"It is the way the program was written," he said.

Some government agencies, he noted, are also showing increased interest in "on-demand" computing, a service idea being touted by IBM and others under which entities buy computing power the way they now buy electricity.

"A lot of agencies that are under budget pressure would love to get their high-performance computing needs in a more variable fashion," Donofrio said.