IBM utility software on its way

Big Blue shows off its Tivoli Intelligent Orchestrator, designed to let groups of servers add computing power to jobs that need it or subtract it when demand drops.

IBM will release software at month's end to let customers start sampling some of the promised benefits of Big Blue's utility computing vision.

IBM has begun showing off the new product, called Tivoli Intelligent Orchestrator, which IBM acquired when it bought Think Dynamics in May. The software lets groups of servers automatically add computing power to jobs that need it or subtract it when demand dies down, said Sandy Carter, vice president of marketing for IBM's Tivoli management software group.

Intelligent Orchestrator is the foundation of a broader effort, code-named Project Symphony, which IBM will begin detailing later in September and October, Carter said. Symphony will include a broader range of options for customers, including bundles of hardware and software that incorporate Intelligent Orchestrator and, eventually, services for installing or running operations using the software.

IBM is showcasing the software at the US Open tennis championship, which relies on pSeries Unix servers at IBM that switch back and forth between running the tournament's Web site and performing cancer-related protein calculations for IBM research, Carter said. The Intelligent Orchestrator software is running on an Intel-based xSeries server that controls the Unix servers, she said.

The software addresses one of the fundamental promises of utility computing, known at IBM as "e-business on-demand" and elsewhere as "adaptive," "dynamic" or "organic" computing. In this vision, now-separate servers, storage and networking devices will be pooled together so that equipment will operate at closer to full capacity. The heavy demands of one important task can be met by allocating resources that had been assigned to another, lower-priority job.

Carter gives the example of a bank, whose computers might be devoted to stock trades in the morning, paycheck processing at lunch and early evening, and online banking at night.
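The bank example can be sketched as a simple priority-based allocator. This is a minimal illustration of the reallocation idea described above, not IBM's implementation; all names, priorities and numbers are hypothetical.

```python
# Hypothetical sketch of priority-based resource reallocation, the core
# idea behind utility computing. All job names, priorities and server
# counts below are illustrative assumptions, not IBM APIs.

def reallocate(pool, jobs):
    """Assign servers from a shared pool to jobs by priority.

    pool: total number of servers available.
    jobs: list of (name, priority, demand) tuples; higher priority wins.
    Returns a dict mapping job name -> servers granted.
    """
    grants = {}
    remaining = pool
    # Serve the highest-priority jobs first; lower-priority jobs get
    # whatever capacity is left over.
    for name, priority, demand in sorted(jobs, key=lambda j: -j[1]):
        granted = min(demand, remaining)
        grants[name] = granted
        remaining -= granted
    return grants

# Morning at the hypothetical bank: stock trading peaks, so it outranks
# payroll and online banking until its demand dies down.
morning = reallocate(10, [
    ("stock-trades", 3, 7),
    ("payroll", 2, 4),
    ("online-banking", 1, 2),
])
print(morning)  # {'stock-trades': 7, 'payroll': 3, 'online-banking': 0}
```

Rerunning the same function at lunch with payroll's priority raised would shift the servers back, which is the moment-to-moment rebalancing the utility model promises.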

The "utility" label comes from the idea that a customer will essentially be able to switch on computing power when needed without worrying about what happens behind the scenes, paying only for the capacity and reliability required. The label grew popular before the utility industry's troubles, such as rolling blackouts in California and the August power outage in the northeastern United States.

The utility computing idea is widely popular, with loud support from server makers IBM, Hewlett-Packard and Sun Microsystems, as well as software companies such as Veritas Software, BMC Software and Microsoft. But fulfilling the vision's promise is a distant goal.

"Dynamic data centers are a direction. They're not a reality," said Illuminata analyst Jonathan Eunice. Products available today from HP, Sun and IBM are at best at the version 1.0 or 1.5 level, he said.

But computing companies are hard at work. IBM advanced its schedule by acquiring Think Dynamics; Sun has made a similar move through its acquisitions of CenterRun, Pirus Networks and TerraSpring for its N1 utility computing plan. HP, meanwhile, relied on TerraSpring software for its Utility Data Center product, and Veritas acquired Jareva Technologies.

Right now, IBM is probably the leader, said Summit Strategies analyst Tom Kucharvy. "They've done as much to confuse the market as they have to educate it, but altogether, they've got more pieces than any of the other companies," he said.

And gradually, companies are moving from selling the grand vision of utility computing to selling bite-size pieces.

"Big Bangs don't work. We saw that with the old SAP rollouts," Eunice said, referring to the notoriously difficult installations of SAP accounting and inventory software that often required numerous consultants to redesign business processes and databases around the new software.

"One of the real advantages about utility computing is that yes, an overarching framework and road map is absolutely necessary, but customers can really implement it in terms of individual components," Kucharvy said.

Indeed, IBM is starting smaller with the Intelligent Orchestrator software. The product is priced based on how many servers it's controlling and costs between $20,000 and $50,000 to control 10 servers, which IBM believes will be an average-size installation.

The software can change a server's job by completely reinstalling its operating system and software, Carter said. In the case of IBM Unix servers or other products that can run several operating systems simultaneously in different partitions, the software can adjust the resources devoted to each task.

The software can be adopted gradually, running in an advisory mode that merely suggests actions to administrators, or acting only once an administrator approves. As a customer grows more comfortable, the software can be allowed to run the operation fully automatically, Carter said.
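That gradual-adoption model amounts to running one decision engine under different levels of autonomy. The sketch below is a hypothetical illustration of the advise/approve/automatic progression; the mode names, threshold and actions are assumptions, not Tivoli's actual interface.

```python
# Hypothetical sketch of a gradual-adoption orchestrator: the same
# decision logic runs in "advise", "approve" or "auto" mode. Mode names
# and the provisioning action are illustrative, not IBM's API.

def orchestrate(load, threshold, mode, approve=lambda action: False):
    """Decide whether to provision an extra server for a loaded job.

    Returns (action, executed): what the software recommends, and
    whether it actually carried the action out.
    """
    action = "add-server" if load > threshold else "no-change"
    if action == "no-change":
        return action, False
    if mode == "advise":
        return action, False            # suggest only; humans act
    if mode == "approve":
        return action, approve(action)  # act only if an admin signs off
    return action, True                 # fully automatic

print(orchestrate(90, 75, "advise"))                           # ('add-server', False)
print(orchestrate(90, 75, "approve", approve=lambda a: True))  # ('add-server', True)
print(orchestrate(90, 75, "auto"))                             # ('add-server', True)
```

The point of the progression is trust: the recommendation logic never changes, only how much authority the customer delegates to it.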

Currently the software runs on IBM's Intel-based and Unix servers, but by the end of 2003, it will run on zSeries mainframes and iSeries midrange servers, Carter said. IBM also plans versions that will run on servers from Sun and others.

Intelligent Orchestrator can control database software from IBM and Oracle and includes Java server software from IBM and BEA Systems.

The Orchestrator product can deal with directly attached or network-attached storage systems today, with support for storage area networks (SANs) due by year end. It also works with networking equipment from Cisco Systems, Lucent Technologies, Alteon WebSystems, APC and F5 Networks, with broader support due later in 2003 and the first quarter of 2004, IBM said.