CNET News Video
What makes IBM's 'green' data center tick: CNET News' Martin LaMonica gets a tour of IBM's lab for green IT, where the data center uses networked sensors and liquid cooling to lower energy use.
[ Music ] ^M00:00:04 >> This is Martin LaMonica from CNET News. Earlier this week, I got a tour of IBM's Green Innovation Data Center in Southbury, Connecticut, from program director Peter Guasti. ^M00:00:15 >> Tell us, first of all, what you're looking at here behind you; and tell us why it's so quiet where we are. ^M00:00:21 >> These look like [inaudible] right now, but they're actually dual function. They have aesthetically pleasing images on them, but their real purpose is as acoustic panels. ^M00:00:31 They deaden the ambient noise in the room so that it's more pleasant when you're in here. ^M00:00:38 The noise isn't as high and hurting your ears. So they serve a dual purpose: they're acoustic panels that dampen the noise in the room. ^M00:00:47 >> Right. Now, the picture behind you is a thermal map of this data center; ^M00:00:52 and I guess all the green points on there are sensors; is that right? ^M00:00:58 >> Yeah, absolutely. So the first step was instrumenting the lab. We instrumented the lab with over 200 sensors. Then we went to the next step and connected them. ^M00:01:07 So now all the sensors are networked and can talk to each other. And we aggregated them into one focal point in a programmable logic controller; ^M00:01:15 and now we're applying intelligence to that, pulling information off that aggregation point to figure out what information helps us optimize the data center's efficiency in terms of energy, power, and cooling. ^M00:01:27 So that's where we're at right now. We want to apply IBM software technologies to make the data center more dynamic, so that it can dynamically adjust cooling levels to the thermal load of the environment.
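The aggregation step Guasti describes, where readings from some 200 networked sensors are collected at one focal point and summarized into a thermal map, could be sketched roughly as follows. All names, locations, temperatures, and the hotspot threshold are illustrative assumptions, not IBM's actual software:

```python
# Illustrative sketch: aggregate raw sensor samples by location and
# flag locations whose average temperature exceeds a threshold.
from statistics import mean

def aggregate(readings):
    """Group raw (location, temp_c) samples by location and average them."""
    by_location = {}
    for location, temp_c in readings:
        by_location.setdefault(location, []).append(temp_c)
    return {loc: mean(temps) for loc, temps in by_location.items()}

def hotspots(thermal_map, threshold_c=27.0):
    """Return locations whose average temperature exceeds the threshold."""
    return sorted(loc for loc, t in thermal_map.items() if t > threshold_c)

# Hypothetical samples from front/rear of racks, ceiling, and floor.
readings = [
    ("rack-01/front", 22.5), ("rack-01/rear", 31.0),
    ("rack-02/front", 23.1), ("rack-01/rear", 33.0),
    ("ceiling/zone-a", 25.4), ("floor/zone-a", 21.9),
]
thermal = aggregate(readings)
print(hotspots(thermal))  # ['rack-01/rear'] -- averages 32.0 C, over threshold
```

The point of aggregating first is the one Guasti makes next: the raw stream is too large to eyeball, so software has to reduce it to the few locations that matter.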
^M00:01:39 >> Now, you said there are about 200 sensors in here; and they're all placed in front of and behind the servers, right? ^M00:01:44 >> Yeah. Right. >> Or storage units. ^M00:01:46 >> There are about 200 sensors. They're placed close: in front of the racks, behind the racks, and in the ceiling and floor, ^M00:01:53 so we get a good picture of the thermal envelope of the room. ^M00:01:57 >> But it's automated, correct? ^M00:02:00 >> Correct. We're at the point now where we're applying the intelligence to make it much more efficient and help us automate this. ^M00:02:07 There's too much information coming in for a human to look at it all. ^M00:02:11 So we have to automate that, so the data coming in can be correlated to figure out what's important: ^M00:02:16 what valves and what variables can we adjust and control to affect the environment so it operates in an optimal zone. ^M00:02:24 >> Now, what kind of equipment can you control? Because there's the IT system, but you're linking back to the heating and cooling, right? ^M00:02:30 >> We sense information from the environment, and we can pull power and energy information off the IT equipment. ^M00:02:38 But what we actually control is the facilities equipment. With facilities equipment like the computer room air conditioning units, we can control the fan speed to apply more cooling to the room, if needed, or less cooling, depending on the thermal load, based on the sensor information. ^M00:02:52 Also, on selected devices that are very highly utilized, we have localized rack-level cooling. ^M00:02:58 We can control how much liquid cooling, or chilled water, gets sent to those specific rack-level devices. So if we find that a particular server is highly utilized and its temperature is increasing, we can adjust the water flow to it to make sure it stays in the same operational range.
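The control loop described above has two outputs: room-level CRAC fan speed and rack-level chilled-water flow, both driven by sensor readings. A minimal sketch, with all setpoints, gains, and units being assumptions for illustration only:

```python
# Illustrative control sketch: more heat above the setpoint means more
# fan speed room-wide and more chilled water to hot, busy racks.
SETPOINT_C = 24.0  # assumed target inlet temperature

def crac_fan_speed(avg_room_temp_c, min_pct=30.0, max_pct=100.0):
    """Scale CRAC fan speed linearly with degrees above the setpoint."""
    excess = max(0.0, avg_room_temp_c - SETPOINT_C)
    speed = min_pct + excess * 10.0  # assumed gain: +10% fan per degree C
    return min(max_pct, speed)

def rack_water_flow(rack_temp_c, utilization, base_lpm=4.0):
    """Send more chilled water (liters/min) to hot, highly utilized racks."""
    boost = max(0.0, rack_temp_c - SETPOINT_C) * utilization
    return base_lpm + boost

print(crac_fan_speed(27.5))        # 3.5 C over setpoint -> 65.0 % fan
print(rack_water_flow(30.0, 0.9))  # hot, busy rack gets extra chilled water
```

Linear gains are the simplest possible policy; the point of the sketch is only the direction of control Guasti describes, facilities equipment reacting to IT-side sensor data.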
^M00:03:15 >> You've been collecting this data for a while. How much variability is there? What kind of picture do you see from the data? ^M00:03:23 >> Right now, we're still in the data collection phase. But we can tell you, for sure, that the environment isn't fixed. ^M00:03:30 It constantly changes, depending on workload. These systems here are used throughout the corporation, around the world. ^M00:03:36 Up to 360 thousand IBMers around the world access the systems. So the workload is very variable, right? ^M00:03:42 So, depending on who's accessing them and the time of day, it does change. So you actually can't set a fixed, constant cooling point and satisfy the thermal dynamics of the room. ^M00:03:54 We actually see this fluctuate quite frequently, depending on the time of day. ^M00:03:58 So we're in that stage now where we're collecting the data, trying to analyze it, and figuring out what we could do to better use that [inaudible] to optimize the room. ^M00:04:07 >> Okay. And all this will eventually make its way into the products, the hardware and software, right? ^M00:04:11 >> Yeah. We're working across IBM with the IBM hardware and software divisions, Systems and Technology Group, and the Tivoli software brand, to take this information and incorporate it into their products. ^M00:04:23 >> Okay. >> At some point. ^M00:04:24 >> Okay. All right. Well, thanks for the info, Peter. ^M00:04:26 >> You're welcome. ^M00:04:27 [ Music ]