Cambridge, Mass.--Corporations are mishandling their data center energy consumption to the point that they risk disruptive failures of their technology infrastructure, a panel of experts said on Tuesday.
Panelists, who included speakers from the research firm the Uptime Institute as well as AMD, Hewlett-Packard, EMC and APC, agreed that the use of electricity in data centers is a problem too few IT professionals are addressing. The panel was assembled by AMD.
"Gains in computing performance at the server and chip level are not being matched by gains in energy efficiency," said Bruce Taylor, chief strategist and evangelist for data center computing at the Uptime Institute. "This ever-widening gap creates a thermal density and power consumption problem that is now a crisis."
The energy consumed by data centers is already becoming a major economic and environmental issue. By 2010, data centers will account for 3 percent of all power consumed in the United States, Taylor said.
He forecast that spending on energy for data center equipment and cooling will climb from between 1 and 3 percent of IT budgets today to between 5 and 15 percent by 2012.
And yet, many data center operators are not aware of the associated energy costs or problems. Taylor cited statistics from a survey of 100 data center operators done last year by Aperture Research Institute, which found that almost 40 percent of respondents said they had run out of space, power, or cooling capacity without sufficient notice.
One of the main reasons why IT professionals don't pay attention to energy consumption is that other people in the organization get the bill. Also, data center construction and design are typically handled by a company's real estate or facilities department.
"There's only so much that can be done at the technology and platform level, and we have to address this at a more holistic, data center level," said Ken Baker, infrastructure technologist at HP.
Indeed, panelists said that there are already several technologies that can lower power consumption significantly.
Virtualization and power-management technologies are already available and installed in many data centers, but too often go unexploited. One financial services company has even started to fine departments that don't use virtualization, citing the higher energy costs, said John Tuccillo, marketing director for power supply company APC.
Other technologies are coming online but have not yet entered mainstream usage. For example, HP sells a system that will automatically dial down the cooling in a data center based on the heat generated by equipment. This is a far more efficient approach than cooling the entire room to 60 degrees--a temperature that's comfortable for people but cooler than needed for data center gear.
"Many systems within a data center are operated manually," said HP's Baker. "Customers have to get comfortable with higher degrees of automation."
Direct current power, use of outside air for cooling, and liquid cooling are approaches that should be explored longer term to lower electricity use. Liquid cooling can reduce consumption by 5 to 10 percent, since water is 3,000 times more efficient than air at removing heat, Taylor noted.
When asked whether these energy-saving measures pay for themselves in a reasonable amount of time, panelists pointed out that many of them cost nothing at all and have simply not been implemented. A survey from AMD found that only 10 percent of users take advantage of existing processor-level power management, which could reduce consumption considerably.
The layout of data centers could also be improved for energy savings, often at no additional cost.
But most customers are still not aware enough of the associated costs and potential risks to have full-scale energy efficiency programs in place, panelists said.
"You can't manage what you don't measure," said HP's Baker.