
Data centers get cool--literally

Hewlett-Packard introduces a lab-grown analysis service that helps companies keep data centers cool--and could help their centralized computing run better.

Stephen Shankland
Hewlett-Packard is taking a simulation technology out of its labs and using it to help companies cool off equipment-packed data centers, easing the growing adoption of centralized computing.

The tech giant on Tuesday launched a service that analyzes the air flow in a data center--a facility filled with server, storage and networking systems--to find the best arrangement of computing and air-conditioning gear inside. The service, which uses a complex modeling technology from HP Labs, can cut the energy spent cooling data centers by as much as 25 percent, according to HP.

Keeping data centers cool is important because overheated computers can lose data or crash. New technologies, such as server consolidation, are leading banks and other companies to centralize computing operations and to use blade servers and other systems that cram in hot processors ever more densely. Until now, a typical response from data center operators to technology changes such as these has been brute force--bringing in bigger air conditioners, for example.

"Most information technology people are not trained in thermodynamics," said Illuminata analyst David Freund.

Current data centers--specialized chambers dominated by hulking computer cabinets, uncomfortably chilly air and the roar of hundreds of computer fans--typically have raised floors, under which cool air flows and power lines and networking cables are laid. Cool air is directed upward to computers, though some of it escapes through holes for cables. Intake ducts at the top of the room draw off the heated air and send it to a cooling system.

The first version of HP's service is a one-time analysis of a company's data center to give a prescription for the best way to arrange the computing equipment, the flow of cool air into the facility and the flow of hot air out, said Brian Donabedian, an HP site planner and environmental specialist.

HP's service uses a technique called computational fluid dynamics to simulate how air flows through a complicated arrangement of ducts, computers and deflectors. The Palo Alto, Calif.-based company began showing off the technology behind the analysis service in 2001.
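To get a feel for what such a simulation involves, here is a minimal Python sketch of the grid-based approach that CFD tools build on: it relaxes a 2D temperature field around a rack row and a strip of cool-air floor tiles. This is a toy heat-diffusion solve, not HP's model--a real CFD code would also solve for air velocity--and every layout value and temperature in it is made up for illustration.

```python
# Toy illustration of the grid-based simulation idea behind CFD tools.
# Solves steady-state heat diffusion on a 2D "floor plan" with Jacobi
# iteration. All rack positions and temperatures are hypothetical.
import numpy as np

GRID = (20, 20)                    # coarse 2D slice of a machine room
temps = np.full(GRID, 20.0)        # ambient air, degrees C

hot = np.zeros(GRID, dtype=bool)   # cells occupied by server racks
hot[5:8, 5:15] = True              # one made-up row of racks
cold = np.zeros(GRID, dtype=bool)  # perforated tiles feeding cool air
cold[15, 2:18] = True

for _ in range(5000):              # iterate until the field settles
    # each interior cell relaxes toward the mean of its four neighbors
    temps[1:-1, 1:-1] = 0.25 * (temps[:-2, 1:-1] + temps[2:, 1:-1] +
                                temps[1:-1, :-2] + temps[1:-1, 2:])
    temps[hot] = 45.0              # racks held at exhaust temperature
    temps[cold] = 13.0             # supply air held at chiller temperature

print(f"hottest open-aisle cell: {temps[~hot].max():.1f} C")
```

Running the same solve with racks or vent tiles moved to different cells shows how a layout change shifts the hot spots--the kind of what-if question the analysis service is built to answer at much higher fidelity.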

Within two years or so, HP will begin offering a more sophisticated second-generation cooling service tied to its Utility Data Center product, said Donabedian. UDC distributes computing jobs across groups of servers and storage systems and can respond to changing workload demands automatically.

In this second phase, called "dynamic smart cooling," the UDC control software will be able to move computing work away from hotter areas of a data center or adjust air-conditioning systems to deal with hot spots, Donabedian said. It will combine stationary temperature sensors with others mounted on an HP robot patrolling the data center.
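A rough Python sketch of what such a control loop might do is below. The zone names, temperature threshold and migrate step are all hypothetical, since HP has not published UDC's interfaces; the point is only the feedback loop from sensors to workload placement.

```python
# Illustrative sketch of the "dynamic smart cooling" idea: read per-zone
# temperature sensors and shift work toward cooler zones. Zone names,
# the threshold, and the job lists are made up for illustration.

HOT_LIMIT_C = 30.0   # assumed trigger point for rebalancing

def rebalance(zone_temps: dict[str, float],
              zone_jobs: dict[str, list[str]]) -> None:
    """Move one job from each overheated zone to the coolest zone."""
    coolest = min(zone_temps, key=zone_temps.get)
    for zone, temp in zone_temps.items():
        if temp > HOT_LIMIT_C and zone != coolest and zone_jobs[zone]:
            job = zone_jobs[zone].pop()        # pick any job to migrate
            zone_jobs[coolest].append(job)
            print(f"migrating {job}: {zone} ({temp:.1f} C) -> {coolest}")

# one pass with made-up sensor readings
temps = {"row-A": 33.5, "row-B": 24.0, "row-C": 31.2}
jobs = {"row-A": ["render-42"], "row-B": ["db-1"], "row-C": ["web-7"]}
rebalance(temps, jobs)
```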

With the cooling analysis service, HP hopes to boost revenue from its profitable services group. In the wake of IBM's success in selling services, many computing companies are seeking to earn extra money by offering to help customers install and run complicated computing equipment.

The service will appeal only to some large customers initially, Donabedian said. "It is fairly complex, time-consuming and could run into some money," he said.

Early customers for the service include the DreamWorks digital animation studio and Pacific Northwest National Laboratory, which is building a mammoth supercomputer with hundreds of two-processor Itanium servers from HP. The laboratory likely will use the dynamic cooling technology, Donabedian said.

Forcing the issue
In their earlier days, computers had relatively few processors--the hottest part of a machine--in a large cabinet. But new server models are forcing the cooling issue, said Illuminata's Freund. Rack-mounted and "blade" servers now pack dozens or even hundreds of processors into a single six-foot rack. In addition, high-end multiprocessor servers from Sun Microsystems already have begun topping the 100-processor mark in a single chassis.

Compounding the issues posed by hotter machines is the technology that makes centralized computing more popular. For example, most higher-end Unix servers today can be divided into multiple "partitions," each with its own copy of the operating system. That feature makes it easier to replace numerous widely dispersed low-end servers with a single easy-to-manage large server, a trend called "server consolidation."

At the same time, special-purpose storage area networks (SANs) let administrators more easily use large, centralized storage systems instead of smaller ones attached to each server.

Rival manufacturers--and their customers--are increasingly looking at the overall cost of running systems, taking into account, for example, electricity charges as well as computer price tags, said Charles King of the Sageza Group analyst firm.

"To save money over time, you need to start looking at cooling costs, the cost of doing wiring maintenance, the cost of bringing a server online," King said. "IBM in particular has been really aggressive in the holistic view of the data center."

The liquid cooling solution
In the long run, adjusting air flow will only help so much with cooling data centers, and more radical changes will be required. The top contender is cooling with water, or some other liquid that's more efficient than air at absorbing heat from processors.
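A quick back-of-envelope calculation, using standard textbook properties for water and air at room temperature, shows why liquid keeps coming up: per unit volume and per degree of temperature rise, water absorbs on the order of 3,500 times as much heat as air.

```python
# Back-of-envelope comparison of how much heat a given volume of water
# can carry versus air. Property values are standard textbook figures
# at room temperature, not numbers from HP or IBM.

air_density = 1.2        # kg/m^3
air_cp = 1005.0          # J/(kg*K), specific heat of air
water_density = 1000.0   # kg/m^3
water_cp = 4186.0        # J/(kg*K), specific heat of water

air_vol_cap = air_density * air_cp        # ~1,200 J per m^3 per K
water_vol_cap = water_density * water_cp  # ~4.2 million J per m^3 per K

print(f"water carries ~{water_vol_cap / air_vol_cap:,.0f}x more heat "
      "per unit volume per degree than air")
# -> roughly 3,500x, so far smaller pipes can move the same heat load
```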

This means liquid-cooled systems, such as older IBM mainframes and Cray supercomputers, could experience a renaissance. IBM researchers working on a super-dense storage system called Collective Intelligent Bricks have begun advocating a return to liquid cooling.

"We are getting actually feedback from high-end customers who say--about water cooling--'What took you so long to get back to it?'" said IBM researcher Winfried Wilcke in an e-mail interview.

While IBM favors centralized water-cooling systems, HP's Donabedian believes smaller, more local cooling also is feasible. It's possible to liquid-cool individual processors or to spray processors with special fluids that can be collected after they cool a chip, then recycled.

"Probably within about five years you'll really begin to see liquid cooling hitting the market," Donabedian said. "All the computer manufacturers are going through their gyrations now, experimenting and trying to figure out the best way to go."