IBM has rebuilt an existing data center to use the latest in energy-efficient technologies and building designs.
A typical scene from your average data center, right? Well, not exactly. This is one half of IBM's Green Innovations Data Center in Southbury, Conn., where Big Blue's internal IT staff pushes the green envelope. This facility, which hosts several internal IBM applications, is packed with millions of dollars' worth of IBM hardware, of course, but also some of the latest energy-efficiency techniques.
On the far right, you can see one of those: a rear-door heat exchanger designed for IBM's high-end iDataPlex server system. Called Cool Blue, the system circulates cold water through the door to cool the hot air blown out by the servers' fans.
IBM's "green" data center was designed with a specific air flow to optimize the cooling system. Cool air is pumped in from below this raised floor so that it flows to the front of server and storage systems. Alternating with these "cold aisles" are "hot aisles," where the backs of server and storage racks are facing. In the hot aisles are vents to pull that hot air out through the ceiling. Rather than pile all manner of cabling below the floor, this facility has only power cords below the floor and only networking cables on top--set up to ensure that there aren't blockages to air flow.
The crux of IBM's energy management system is a network of 200 sensors that monitor temperature behind, in front of, and above the servers. Data center managers need to adjust the placement of the sensors, two of which sit on this rack's back door, to get an accurate readout. Temperatures can easily top 90 degrees at the back of a rack, while the facility overall stays at about 70 degrees.
This is the control unit where data from the various sensors comes in and the cooling systems are controlled. In addition to air temperature, the sensors can also track the temperature and flow rate of the water used in the water-cooling system. Getting the data is vital to understanding the overall thermal picture in the facility, but the bigger challenge is building analytical and visualization tools that help make sense of the reams of information, said Peter Guasti, program director for IBM's Green Innovations Data Center. Responses to changes in temperature also have to be automated: when the temperature of a server rack climbs because it's handling a heavier workload, the cooling system needs to compensate in that particular spot, he explained.
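As a rough illustration of that kind of automation (not IBM's actual control code; the sensor layout, setpoint, and scaling below are assumptions), a threshold-based loop might boost cooling only for racks whose exhaust-side sensors are running hot:

```python
# Illustrative sketch only: a simplified threshold-based cooling response.
# The sensor fields, setpoint, and control interface are hypothetical.

from dataclasses import dataclass

@dataclass
class SensorReading:
    rack_id: str
    location: str        # "front", "back", or "above"
    temp_f: float        # temperature in degrees Fahrenheit

HOT_AISLE_SETPOINT_F = 95.0   # hypothetical exhaust-temperature limit

def adjust_cooling(readings: list[SensorReading]) -> dict[str, float]:
    """Return a per-rack cooling boost (0.0-1.0) based on exhaust temperatures."""
    boosts: dict[str, float] = {}
    for r in readings:
        if r.location != "back":
            continue  # only exhaust-side sensors drive this decision
        # Scale the boost by how far the rack exceeds the setpoint.
        excess = max(0.0, r.temp_f - HOT_AISLE_SETPOINT_F)
        boosts[r.rack_id] = max(boosts.get(r.rack_id, 0.0), min(1.0, excess / 10.0))
    return boosts

if __name__ == "__main__":
    sample = [
        SensorReading("rack-12", "back", 98.5),
        SensorReading("rack-12", "front", 68.0),
        SensorReading("rack-07", "back", 88.0),
    ]
    print(adjust_cooling(sample))  # {'rack-12': 0.35, 'rack-07': 0.0}
```

The point, as Guasti describes it, is to compensate in the particular spot that is hot rather than overcool the whole room.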
The center is beta testing an upcoming version of IBM's Tivoli monitoring software, which takes in that sensor data and lets workers manage the energy required to cool and run the data center.
A closer look at the rear-door heat exchanger mounted on the back of this rack of servers. The visible coils circulate water cooled by the building's chiller. While the fans on the neighboring rack exhausted air at over 90 degrees, the back of this one was cool to the touch. Data from the temperature sensors is fed into a central console, allowing IBMers to automate both the air- and water-cooling systems.
This data center is also testing a prototype of another cooling system called Side Car. Instead of being fitted onto the back of a rack, Side Car envelops both sides of the rack, creating a self-contained cooling system that uses circulating cold water (see the water pipes at bottom left). It could be used to cool one specific "hot spot," said Peter Guasti, program director for IBM's Green Innovations Data Center, who is holding the door open.
Although it doesn't look very high-tech, this "snorkel" can save money on cooling in a data center. Developed by IBM researchers, it's designed to sit at the bottom of a server rack and direct the cool air coming up from the floor. Aiming the air can reduce cooling costs, particularly on racks that aren't full of servers and only need cooling in one spot. Similarly, IBM covers the empty portions of racks with blanking plates. The snorkels attach with a magnetic backing.
A look at the under-floor controls for the water-cooling system, which is monitored for flow rate and temperature. The water is cooled by the building's chiller, but IBM set up a local loop so the chilled water can serve individual server racks, a way to improve overall efficiency.
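To see why flow rate and temperature are the two numbers worth watching, here is a back-of-the-envelope sketch (with made-up readings, not data from this facility) of the heat a water loop carries away, using heat = flow × specific heat × temperature rise:

```python
# Rough estimate (not IBM's monitoring code): heat removed by a water loop,
# Q = m_dot * c_p * delta_T. The flow rate and temperatures are hypothetical.

WATER_SPECIFIC_HEAT = 4186.0   # J/(kg*K), specific heat of water
WATER_DENSITY = 1.0            # kg per liter, close enough at these temperatures

def heat_removed_kw(flow_l_per_min: float, inlet_c: float, outlet_c: float) -> float:
    """Heat carried away by the loop, in kilowatts."""
    mass_flow_kg_s = flow_l_per_min * WATER_DENSITY / 60.0
    delta_t = outlet_c - inlet_c
    return mass_flow_kg_s * WATER_SPECIFIC_HEAT * delta_t / 1000.0

# Example: 30 liters per minute, with the water warming from 18 C to 28 C,
# carries away roughly 21 kW of heat.
print(round(heat_removed_kw(30.0, 18.0, 28.0), 1))  # ~20.9
```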
Although these look like plain posters, they are made of a material that keeps the noise level down in data centers. By damping the sound that bounces off them, the panels make it quieter for the people who work in data centers, which can get loud.
This power distribution unit, made by Emerson, works a lot like the circuit breaker panel in a home, with the ability to cut power to individual racks. Each circuit is also individually metered, giving the data center operators detailed information on how much electricity specific pieces of equipment consume.
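That per-circuit metering is what turns raw power draw into actionable numbers. A simple, hypothetical example (the electricity rate and readings are assumptions, and this is not the PDU's actual interface) of converting a rack's metered draw into a monthly electricity cost:

```python
# Hypothetical illustration of what per-circuit metering enables: turning a
# metered average power draw into monthly energy use and cost for one rack.

ELECTRICITY_RATE_USD_PER_KWH = 0.12   # assumed utility rate

def monthly_cost_usd(avg_power_kw: float, hours: float = 24 * 30) -> float:
    """Cost of running a circuit at a given average draw for a month."""
    energy_kwh = avg_power_kw * hours
    return energy_kwh * ELECTRICITY_RATE_USD_PER_KWH

# A rack drawing an average of 4.2 kW costs roughly $363 a month to power,
# before counting the additional electricity needed to cool it.
print(round(monthly_cost_usd(4.2), 2))  # 362.88
```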