
Supercomputer ranking method faces revision

On the eve of a new list ranking the world's fastest computers, momentum is building for a change in the measuring process.

Stephen Shankland
An effort by PC makers to move beyond a single, simple speed measurement for showing computing performance is now being matched by a similar push in the world of supercomputers--and not everyone is applauding the change.

An organizer of the Top500 supercomputer rankings has produced a broader test suite that measures multiple dimensions of a machine's performance. By comparison, a mathematical test called Linpack is currently used to rank systems on the Top500 list, which is released twice a year with much fanfare.

"For a long time it's been clear to all of us (that) we needed to have more than just Linpack," said Jack Dongarra, a University of Tennessee professor who helped create Linpack and who's now working on a suite of tests that go beyond pure number-crunching prowess. "No single number can reflect the overall performance of a machine."

In the world of desktop PCs, where increasing a chip's clock speed by 20 percent rarely yields a 20 percent overall system boost, a similar shift away from simple but potentially misleading measurements has already occurred. Chipmaker AMD began moving away from gigahertz labeling in 2002, and this spring Intel made a similar move for its Pentium and Celeron chips.

The government-sponsored test suite for supercomputers, called the HPC Challenge Benchmark, has pleased some supercomputer makers, such as Cray. But IBM, which is moving aggressively into the supercomputing market and is featured more prominently on the Top500, is more cautious.

The new suite of seven tests won't replace Linpack as the Top500 yardstick, Dongarra said. For one thing, the decades-old Linpack permits historical comparisons in high-performance computing, or HPC; for another, a system that can't post a high Linpack score won't do well on the other tests, he said.

The new tests grew out of a program the United States government launched after being spooked by a Japanese supercomputer called Earth Simulator, which has topped the Top500 since 2002. The program, funded by the Defense Advanced Research Projects Agency (DARPA), has awarded grants to IBM, Cray and Sun Microsystems to develop new supercomputer designs.

"It was done for DARPA and the National Science Foundation and the Department of Energy. They wanted something to measure the overall effectiveness of computers designed for the program, and they realized that Linpack was not good enough," Dongarra said.

The next Top500 list is scheduled for release Sunday as the International Supercomputer Conference begins in Heidelberg, Germany.

It's not the first time the benchmark suite idea has been raised. Erich Strohmaier, another Top500 organizer, endorsed a composite test in 2000 to supplement Linpack.

New tests, new fans
Some companies are eagerly promoting the new test suite--in particular, Cray. Five Cray X1 systems lead one test, which measures memory transfer speed--a contrast with the company's comparatively modest showing on the Top500 list.

"Customers are always going to want to run their particular codes, but it gives a good understanding about how a system performs in different areas," said Stephen Sugiyama, a Cray marketing manager. "They've done a lot of work to pick a few characteristics about systems that matter to customers."

Of the Linpack-based Top500, Sugiyama said, "It's a nice census of very high-performance systems, but when it's used to rank systems, it's not necessarily a good ranking."

Cray has specialized for years in supercomputing. IBM, though, is trying to adapt its general-purpose business servers to the market, with substantial success selling Unix servers and clusters of small Linux computers joined by high-speed networks. And Big Blue is more skeptical.

"Everyone understands Linpack for what it is and what it isn't. No one understands these additional benchmarks in terms of what they are and what they are not," said Dave Turek, leader of IBM's "Deep Computing" team. "The hazard is thinking that more benchmarks is more illumination. It might just generate more degrees of confusion."

Many tests represent extreme and potentially unusual computing challenges, and it's not yet clear how well they align with actual customer work, Turek said. IBM recommends customers try out their software before buying a system.

In addition, the benchmark is skewed to reflect the interests of specific government agencies, Turek said, alluding to intelligence organizations such as the FBI, CIA or the National Security Agency.

"Three-letter agencies that all have different kinds of views in terms of what they see as important--they have all stuck something in there to accommodate their kinds of needs," Turek said.

New dimensions
Linpack measures how fast a system can solve a large, dense system of linear equations--a test that gauges processor performance well but not other aspects of a supercomputer. For example, it doesn't address how fast data is transferred to or from memory or disk storage systems.
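To make the measurement concrete, here is a minimal Python sketch of a Linpack-style timing, assuming NumPy is available; the real benchmark is a far more carefully tuned program, and the problem size here is purely illustrative.

```python
import time

import numpy as np

# Illustrative sketch of a Linpack-style measurement (not the real benchmark):
# time the solution of a dense linear system Ax = b, then convert the elapsed
# time into floating-point operations per second (flops).
n = 2000
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

start = time.perf_counter()
x = np.linalg.solve(A, b)  # LU factorization plus triangular solves
elapsed = time.perf_counter() - start

# Standard Linpack operation count for an n x n solve: (2/3)n^3 + 2n^2.
flops = (2 / 3) * n**3 + 2 * n**2
print(f"{flops / elapsed / 1e9:.2f} Gflop/s")
```

The score rewards raw processor arithmetic; nothing in the loop above stresses disk, network or memory subsystems, which is exactly the gap the new suite aims to fill.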

And though Linpack tests a type of math called "floating-point" calculations, which involve a continuous spectrum of numbers, it doesn't test "integer" operations, which involve whole numbers. Integer operations are used in problems such as processing genetic sequences.
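As a toy illustration of the distinction (not drawn from the benchmark itself), the snippet below contrasts the two kinds of arithmetic; the genetic-sequence counting is a hypothetical stand-in for the integer workloads mentioned above.

```python
# Toy contrast (illustrative only): floating-point arithmetic works on
# fractional values, integer arithmetic on whole numbers.
samples = [0.25, 1.5, -3.75]
mean = sum(samples) / len(samples)  # floating-point operations

# Hypothetical integer workload: counting bases in a genetic sequence.
sequence = "GATTACA"
gc_count = sequence.count("G") + sequence.count("C")

print(f"mean={mean}, G+C count={gc_count}")
```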

The HPC Challenge Benchmark suite, in contrast, includes tests such as Stream, which measures how fast data can be transferred from memory to a processor; Ptrans, which measures how fast one processor in a supercomputer can communicate with another; b_eff, which measures the response time and data capacity of a network; and DGEMM, which multiplies one array of numbers, called a matrix, with another.
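Stream's core measurement is easy to sketch. The snippet below imitates its "triad" operation in Python with NumPy; the array size and the bandwidth arithmetic are illustrative assumptions, and the official Stream benchmark is a separate, tuned program.

```python
import time

import numpy as np

# Rough imitation of the Stream "triad" pattern: a = b + scalar * c.
# The result is limited by how fast data moves between memory and the
# processor, not by arithmetic speed. The array size is illustrative.
n = 10_000_000
b = np.ones(n)  # 8-byte floating-point values
c = np.ones(n)
scalar = 3.0

start = time.perf_counter()
a = b + scalar * c
elapsed = time.perf_counter() - start

# The triad reads b and c and writes a: at least three 8-byte values
# moved per element (NumPy's temporary arrays add more traffic).
bytes_moved = 3 * n * 8
print(f"roughly {bytes_moved / elapsed / 1e9:.2f} GB/s")
```

A machine can post a towering Linpack number yet crawl on a loop like this, which is why Cray's memory-oriented X1 systems fare so differently on the two measurements.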

The benchmark software runs all the tests simultaneously, Dongarra said, so manufacturers won't be able to run just one test or another. However, because the tests measure different aspects of a system, it's not meaningful to wrap the seven results into a single composite score, he added.

Of the seven tests in the suite, only five currently are measured, and changes or additions are possible, according to the benchmark's Web site.

Meanwhile, the Top500 isn't going away, despite its imperfections.

"It clearly has a place. It does attract a lot of attention in using this one number to rate machines. There are some bragging rights that go along with it," Dongarra said.