IBM data center gets deep energy retrofit

IBM's "green innovation" data center uses cutting-edge technologies, such as temperature sensors, and building designs aimed at cutting energy consumption.

Martin LaMonica Former Staff writer, CNET News
Martin LaMonica is a senior writer covering green tech and cutting-edge technologies. He joined CNET in 2002 to cover enterprise IT and Web development and was previously executive editor of IT publication InfoWorld.

SOUTHBURY, Conn.--IBM's "green" data center here is kind of like a techie version of the "This Old House" television show. But in this case, the project was to build a showcase for energy-efficient computing, rather than construct a new addition for a suburban home.

IBM's main problem was data center sprawl. Five years ago, internal IT staff could barely keep up with growing demand for computing resources from employees, causing an expansion from one data center location to four--a situation that was costly and inefficient.

Now, those four data centers have been consolidated into a single spot with the latest in energy-efficient tech gear, including a network of 200 sensors and water-cooled servers. It also uses what are considered the best practices for physically laying out a data center, with close attention to everything from cabling to air flow.

Photos: Inside IBM's deep green data center


Making data centers more energy efficient has been a growing priority for technology managers for the past few years, as companies seek to trim spending on electricity and reduce their environmental footprint. The Environmental Protection Agency in 2007 estimated that data centers alone use about 1.5 percent of all electricity in the U.S. and are on a pace to double consumption in the coming years. In IBM's case, it deals with high volumes--its wikis are used by 365,000 people--and a growing number of applications.

IBM's tech staff did what many others in their position have done: they consolidated their computing workload with virtualization and upgraded to new, more energy-efficient hardware.

But packing more servers--each with multicore processors--into smaller spaces creates more heat, exacerbating the challenge of keeping the space cool. IBM is using a number of techniques to cool efficiently, but the guiding principle is to match the cooling with the required heating load.

"You have to physically integrate the IT and physical (cooling) equipment so you can adjust the air conditioning to match the thermal load--the system should be very dynamic," said Peter Guasti, program director for IBM's Green Innovations Data Center.

Just as office buildings or hotels heat or cool rooms even when there are no people in them, many data center operators don't have fine-grained control over cooling systems. That means the temperature can be set lower than it needs to be, or a "hot spot" can emerge when one piece of equipment carries a heavy computing load.

Combining IT and building architecture
To keep the air conditioning well tuned, IBM is gathering lots of data. Sensors are placed behind, in front of, and on top of server and storage racks to monitor the temperature. The data is collected and analyzed so that the air and water cooling systems can be automatically adjusted as needed, Guasti explained.

Operators can get a "thermal map" of the data center based on the sensor data to help find trouble spots. IBM is also beta testing an upcoming version of its Tivoli Energy Management software, which will be able to incorporate the sensor data into the systems management program.
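The pipeline Guasti describes, per-rack temperature readings aggregated into a thermal map with outliers flagged, can be sketched in a few lines. This is a toy illustration, not IBM's Tivoli software; the rack names, sensor positions, and the 27 °C hot-spot threshold are all assumptions for the example:

```python
# Toy sketch of a thermal-map pipeline: aggregate per-rack sensor
# readings and flag "hot spots" that exceed a chosen threshold.
# (Hypothetical data; not IBM's actual instrumentation or software.)

from collections import defaultdict

THRESHOLD_C = 27.0  # assumed upper bound for this example

def thermal_map(readings):
    """Group (rack_id, position, temp_c) readings into a per-rack map."""
    racks = defaultdict(dict)
    for rack_id, position, temp_c in readings:
        racks[rack_id][position] = temp_c
    return dict(racks)

def hot_spots(readings, threshold=THRESHOLD_C):
    """Return the readings whose temperature exceeds the threshold."""
    return [(r, p, t) for r, p, t in readings if t > threshold]

readings = [
    ("rack-01", "front", 21.5),
    ("rack-01", "rear", 29.8),   # hot exhaust behind the server fans
    ("rack-02", "front", 22.0),
    ("rack-02", "rear", 26.1),
]
print(hot_spots(readings))  # -> [('rack-01', 'rear', 29.8)]
```

In a real deployment the flagged readings would feed the cooling controls, so air conditioning ramps up only where the thermal load demands it.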

Watch this: What makes IBM's 'green' data center tick

"The bright idea is not so much putting the sensors in. It's what you do with the data--you get reams of information so you have to try to figure out what's important and not," Guasti said.

Air flows along a predetermined path: "cold aisles" pump cooled air from the floor to the front of the equipment, while the hot exhaust behind the server fans is drawn upward and out through the ceiling in "hot aisles."

To lighten the overall cooling load, IBM is using its liquid-cooling systems, originally code-named Cool Blue, which fit onto the back of server racks. These heat exchangers cool the hot air coming from the servers' fans by circulating cold water through coils; the water absorbs the servers' heat and is then cooled by the building's chiller.

IBM is looking at a variety of other ways to lower energy consumption, including solid-state hard drives and the use of outside air--filtered to remove contaminants and humidity--to cool the building, Guasti said.

Saving green or green PR?
The Green Innovation Data Center was designed for tours so customers can get some ideas on how to lighten their own data centers' energy load. But it's not just for show--the center runs applications used by thousands of people.

And the investments IBM made in making the center more efficient are "very cost justified," said Patrick Toole, the company's newly named chief information officer, in an interview. IBM as a company has wrung $3 billion in costs out of its operations over the past year, an effort it plans to continue, he said.

But the company measures the "payback" from upgrading its data center not only in energy savings and environmental benefits but also in business process improvements, Toole said.

For example, the data center allows IBM to operate an internal "cloud computer." Employees can procure computing resources--server processing and storage space, for example--for a certain amount of time on a subscription basis. In the past, employees asked the IT group to install a server for each new application, which is less efficient than a shared-resources model.

Also, the influx of data on energy use lets data center managers better track related costs.

"The instrumentation we have with what is going on is so much more granular than before. We haven't had dashboards with regard to the green aspects before," Toole said. "Now we can see things like energy on a smartphone and we're able to manage that."

Updated at 7:15 A.M. PT with corrected title for Toole and video added.