Armonk, N.Y.-based IBM is one of the global leaders in high-performance computing, accounting for 39 percent of the processing power among the world's fastest supercomputers.
The federal government is one of the largest consumers of supercomputers, employing them to conduct astrophysics research, improve cryptography techniques and monitor the nuclear weapons stockpile, among other tasks.
Still, despite the familiarity, the two groups don't always see eye to eye. In general, most private companies want to build systems out of technology that can eventually find its way into a broad spectrum of commercial products and not necessarily machines tailored to specific research demands.
"We're interested in mainstreaming everything," said Nick Donofrio, senior vice president of technology and manufacturing at IBM. "We're only going to do things that are commercially viable."
By contrast, government officials have said that high-end computers built from off-the-shelf components could start to compromise performance--a situation that may force the government to begin building, or funding the development of, systems designed specifically for its needs.
High-end systems built from common components could take longer to develop, according to David Nelson, director of the National Coordinating Office for Information Technology Research and Development (NITRD), which coordinates high-performance computing efforts for agencies such as the Homeland Security and Energy departments. Such systems could also suffer from limited memory and input/output performance, Nelson noted in a slide presentation at last week's government-IBM meeting.
"Some important applications/algorithms are not amenable to COTS-based (common off-the-shelf) HEC (high-end computers)," read one of the slides in Nelson's presentation.
Examples of research that could be compromised by computers with off-the-shelf parts include protein folding, reusable-launch vehicle design and ocean-state analysis.
A federal task force will release a report in August detailing a five-year plan for federal supercomputing development and purchasing, according to Nelson's presentation.
The NITRD did not return calls for comment.
Other sources have pointed to Japan's Earth Simulator, a specially built system that currently tops the rankings of the world's fastest supercomputers, as a sign that the United States has fallen behind in the field.
The seeming significance of the Earth Simulator's lead, though, can be exaggerated. "National leadership shouldn't be defined in a moment in time," said Kathleen Kingscott, director of public affairs at IBM, who added that machines built of technology that can be broadly deployed have historically proved superior to specially built systems.
IBM is working on two supercomputers for the Lawrence Livermore National Laboratory--ASCI Purple and Blue Gene/L--that will be, respectively, four and ten times more powerful than the Earth Simulator. ASCI Purple is scheduled to become active in 2004.
Donofrio said that IBM and the various agencies are in discussions to determine the basic needs and direction for supercomputing. In some instances, he added, performance issues arise because the software wasn't originally written to run on systems built from off-the-shelf components.
"It is the way the program was written," he said.
Some government agencies, he noted, are also showing increased interest in "on-demand" computing, a service idea being touted by IBM and others under which entities buy computing power the way they now buy electricity.
"A lot of agencies that are under budget pressure would love to get their high-performance computing needs in a more variable fashion," Donofrio said.