Assessing only pure performance is passé. The debate these days is about performance per watt, which seems like it should be a simple miles-per-gallon type of calculation. However, while miles are miles and gallons are gallons, there's no one simple way to measure processor performance, and measuring the amount of power consumed by today's chips is proving just as difficult.
"We do have industry-standard benchmarks for performance, however imperfect they may be," said Gordon Haff, an analyst with Illuminata. "We do not have industry-standard benchmarks for performance-per-watt," he said, adding that it might be some time before those are developed.
Intel and AMD have taken to the PowerPoint slides over the past week, presenting different ideas of how much power is consumed by the other's platform. After AMD made its case last week during its analyst meeting, Intel countered Friday morning with a briefing by Kirk Skaugen, vice president and general manager of Intel's server group.
Intel is getting set to release Woodcrest, a new server processor, on June 26, Skaugen announced Friday. Woodcrest will be a major boost to Intel's server division when it comes to performance, and Intel believes it will retake the lead in this new metric of performance per watt, he said. AMD, however, is unwilling to concede that point just yet, although its competitive marketing no longer cites the pure-performance benchmarks that have carried it through the last two years.
Processor and server vendors often point to several well-known benchmark tests when they want to measure processor performance in certain types of situations, such as the various TPC (Transaction Processing Performance Council) benchmarks for online transaction processing or Web serving, and the SPEC (Standard Performance Evaluation Corporation) tests for measuring integer and floating-point performance. But vendors spend millions tweaking their systems to produce favorable results on those tests, which means most customers insist on running test systems in their own environments before making a decision.
There's even less precedent for measuring power consumption. Traditionally, vendors have pointed to TDP (thermal design power), which is a specification provided to server and PC makers as a guideline for the cooling systems they must design into their products. TDP is measured by tracking a processor's thermal output when it is running at 100 percent utilization.
Parsing the differences
But TDP is somewhat unrealistic, in that servers and PCs aren't usually running anywhere near full bore for extended periods of time. "This is like going around your house and counting light bulbs to determine the monthly power bill," Skaugen said.
AMD measures its power consumption using a "max power" figure, which represents the single largest amount of power that the chip can possibly draw, said Brent Kirby, product manager for servers and workstations at AMD. "We're conservative," he said, noting that in 2003, when AMD was trying to break into the server market with Opteron, it couldn't afford to have any thermal problems, so it rated its chips at the maximum number.
The trouble is that hitting that maximum power number is even more unrealistic than TDP, said Cory Roletto, a platform marketing engineer at Intel. For that reason, Intel uses TDP numbers to rate its chips. However, while AMD's TDP numbers are its maximum power numbers, Intel's TDP numbers are lower than its maximum power numbers. When it makes its comparisons, AMD uses the "max power" numbers that Intel publishes in its technical documents. This makes it look like there is more of a difference between the two companies in real-world situations, even though AMD's Kirby conceded that the maximum power number cited is virtually unattainable for almost any real-world workload.
Many ways to get results
Intel's argument in claiming performance-per-watt superiority is that Woodcrest represents a new performance lead, putting it so far ahead of AMD that Intel wins the performance-per-watt game as well. The company has released results based on preproduction systems. Those results would appear to give Woodcrest a clear lead over AMD systems.
But in the configuration information listed in the fine print, in some cases the Intel systems were using twice as much memory as the AMD systems, such as in the TPC-C, SAP and Lotus Domino tests. Intel claimed a 49 percent, 21 percent and 30 percent advantage, respectively, in those tests. To be fair, Intel also published the results of other tests in which the AMD system had twice as much memory and Intel still prevailed by a healthy margin.
The confusing result is that the numbers both companies are throwing around in presentations to financial analysts and the press are open to a fair amount of interpretation, depending on where your loyalties are. If there are multiple ways of measuring performance, and multiple ways of measuring power, then there are even more ways of measuring performance-per-watt.
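A toy calculation makes the point concrete. The chip names and all figures below are invented for illustration only (they are not actual Intel or AMD numbers): with the same benchmark scores, one chip wins performance per watt when the denominator is TDP or max power, while the other wins when measured wall-socket draw is used instead.

```python
# Hypothetical illustration: the performance-per-watt "winner" depends
# entirely on which power figure goes in the denominator.
# All numbers are invented; power figures are in watts.

chips = {
    "Chip A": {"score": 100, "tdp": 80, "max_power": 95, "measured": 60},
    "Chip B": {"score": 110, "tdp": 95, "max_power": 130, "measured": 62},
}

for metric in ("tdp", "max_power", "measured"):
    # performance per watt = benchmark score / chosen power figure
    ratios = {name: c["score"] / c[metric] for name, c in chips.items()}
    winner = max(ratios, key=ratios.get)
    print(f"{metric:>10}:",
          {name: round(r, 2) for name, r in ratios.items()},
          "-> winner:", winner)
```

Under TDP and max power, Chip A comes out ahead; under measured draw, Chip B does — which is exactly why each side can pick a denominator that flatters its own parts.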
So how does a server buyer make a decision? The old-fashioned way: Insist on a test system. The only way to be totally sure which particular system is best suited for a particular environment is to run that system in the environment, both Kirby and Roletto agreed. Buyers can test the performance of their application environment, and use power meters to test the power consumption of these systems "at the wall."
This is by far the best way to measure power consumption, said John Enck, an analyst with Gartner. Some IT managers Enck has spoken to are starting to measure kilowatts per rack, or how much power it takes to keep a rack of servers up and running.
Of course, this hasn't stopped either company from spending lots of time and effort throwing around the numbers in an attempt to influence buyers. What would be nice is an energy rating, like those on refrigerators, that shows how much energy a server consumes compared with the range of available products, analysts said. AMD and server vendors have started to discuss these issues in an industry consortium, but it will be hard to get everyone on the same page.
"Without naming any names, if Company A is lagging Company B in performance per watt, they are not going to be thrilled about signing up for some agreed-to industry-standard benchmark that puts them behind," Haff said.