IBM expands flexible computing plan

Big Blue plans to add muscle to its on-demand computing push Thursday, with the debut of several efforts designed to help it defend its position of power in the tech world.

Stephen Shankland
IBM adds muscle to its on-demand computing push Thursday, with the debut of several efforts designed to help it defend its position of power in the technology world.

The new efforts span Big Blue's storage, software and server lines and dovetail with the on-demand initiative to emphasize products and pricing strategies that can adapt to fluctuations in computing demand. IBM says it is spending $10 billion to develop and publicize the push.

The Thursday launch includes a service that lets customers tap into varying amounts of server power as their demand changes, IBM said. The company will also show a "virtualization" technology that pools storage systems so more of their total capacity can be used, along with software that automatically reassigns tasks among low-end servers.

Also on the menu is a revamp of IBM's WebSphere business software to allow it to be controlled by Big Blue's "grid" software, which was originally developed to unite groups of computers into a supercomputer.

Topping the product and service debuts is the introduction of a major overhaul to IBM's pricing scheme, called the Open Infrastructure Offering. OIO is a customized contract under which customers pay for their entire collection of computing hardware, software and services with a single fixed monthly payment.

IBM's on-demand plan is one instance of the utility computing trend that's sweeping the industry. In the ultimate vision of utility computing, companies pay for computing capacity as they use it, the way they pay for electricity today. In the nearer term, utility computing often means they can fire up a server's unused processors as needed or tap into an IBM data center to accommodate spikes in demand.

Money matters
Utility computing is closely related to two other concepts that make it easier to treat computing gear as one gigantic pool of computing power: automation that lets computers manage themselves, and virtualization that shields software from the underlying infrastructure it's running on.

OIO cuts costs overall and permits customers to count on upgrades without worrying about what specific gear they'll need to buy in the future, said Mark Shearer, vice president of IBM's server products unit.

"We take everything we do at IBM--hardware, storage, software, maintenance, disaster recovery, global financing--and we work with clients to put it all together so they have a single monthly payment over a period of years for their information technology infrastructure," Shearer said. OIO also can accommodate sudden jumps in computing needs, a feature that's a key component of IBM's overall on-demand push.

Electronic payment specialist TSys is among the customers who have negotiated OIO contracts, Shearer said.

The plans mark a complete turnaround from the arrogant bureaucracy IBM became in the 1980s and has been working for years to shed, said Robert Frances Group analyst Ed Broderick, who is attending an analyst conference on the IBM plans in Palisades, N.Y.

"This is a new IBM I'm hearing about today than three years ago," Broderick said. "They are getting their act together on hardware, software and services. IBM Global Services and Global Financing are doing some crazy, wild stuff. IBM's easier to do business with. They're being more responsive to customer-driven requirements."

OIO, for example, makes life easier for computing staff tired of constantly fighting to get computing purchases approved by upper management, Broderick said. "You know what your costs are. It provides budgetary sanity."

IBM rivals Sun Microsystems and Hewlett-Packard have begun their own utility, virtualization and self-managing infrastructure efforts.

Switching on the off switch
IBM has offered the ability to increase the computing capacity of its mainframes for some time and has spread it to its other servers in recent years. Thursday, it will announce similar plans coming later in 2003 for its high-end Shark storage system and its BladeCenter chassis that houses as many as 14 dual-processor Intel-based servers.

Shark will ship with as much as 6.9 terabytes of unused disk space that customers can switch on and pay for as needed. And they'll be able to buy BladeCenters with seven of the 14 blades idle, paying as they're turned on within a six-month period.

Another major change in IBM's utility offering will be the ability to turn off capacity in some computers when it's not needed anymore, Shearer said. IBM offers this feature in its midrange iSeries line, and in coming weeks Big Blue will offer it in its pSeries Unix servers and zSeries mainframes as well, he said.

At first blush, this "on-off capacity on demand" may seem like a minor adjustment to the strategy, but Broderick said it shows a dramatic change. IBM in the past saw capacity-on-demand features as a way for customers to upgrade, but Big Blue didn't want to allow them the option of switching off a processor and ratcheting payments to IBM back down accordingly.

"I never thought IBM would allow you to turn it off. You could always go up, but you could never go down," Broderick said. "This is a very loud embodiment of utility computing. You pay for what you use, when you use it."
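The "pay for what you use, when you use it" model Broderick describes can be sketched as simple metered arithmetic. The sketch below is purely illustrative: the processor counts, hours and hourly rate are hypothetical, not IBM's actual pricing, and the comparison with an "upgrade-only" bill just mirrors the up-but-never-down model the article contrasts it with.

```python
# Illustrative sketch of "on-off capacity on demand" billing.
# All figures here are hypothetical, not IBM's actual pricing.

HOURLY_RATE_PER_PROCESSOR = 1.50  # hypothetical rate, in dollars

def metered_charge(usage_log):
    """Bill from a log of (active_processors, hours) entries.

    Under on-off capacity on demand, only actual active
    processor-hours are charged; switching capacity off
    ratchets the payment back down.
    """
    processor_hours = sum(active * hours for active, hours in usage_log)
    return processor_hours * HOURLY_RATE_PER_PROCESSOR

def upgrade_only_charge(usage_log):
    """Bill under the older upgrade-only model: once capacity is
    switched on, the customer keeps paying for the peak level."""
    peak = max(active for active, _ in usage_log)
    total_hours = sum(hours for _, hours in usage_log)
    return peak * total_hours * HOURLY_RATE_PER_PROCESSOR

# A month where demand spikes briefly, then drops back down:
log = [
    (4, 600),   # baseline: 4 processors for most of the month
    (12, 72),   # 3-day spike: 12 processors switched on
    (4, 48),    # back to baseline after the spike
]

print(metered_charge(log))       # 5184.0 -- pay only for what was used
print(upgrade_only_charge(log))  # 12960.0 -- locked in at the peak
```

The gap between the two figures is the saving Broderick points to: the ability to go back down, not just up.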

In a related move, IBM will release an automation product in the third quarter called Web server provisioning, which automates the process of installing software on a new server or one being assigned a different task.

Virtualization crops up in IBM's WebSphere and storage products.

WebSphere has been retooled so applications running on it can be controlled by the Open Grid Services Architecture, the supercomputing grid software that's been adapted with Web services technology from the business computing realm. The new WebSphere, available later this quarter, will make it easier to run WebSphere applications on groups of servers.

IBM also discussed plans for its TotalStorage Virtualization family of products, which pools storage systems together without ruffling the feathers of the servers tapping into that storage space. Eliminating isolated storage systems means it's less likely that unused storage space will go to waste.

Much of the virtualization technology comes from IBM's Storage Tank project. The full-fledged Storage Tank technology will be available in December.

On-demand computing is one of the central initiatives of new Chief Executive Sam Palmisano.

"Customers no longer think about computing as a collection of piece parts," Palmisano told shareholders at the company's annual meeting Tuesday. "In the past, customers used technology to automate standalone operations like payroll and inventory control...Today, customers want to use technology to pull those standalone operations into a unified whole."