That's exactly why animation studio Threshold Digital Research Labs signed up IBM to handle the labor-intensive job of computer-rendering animation images. Instead of Threshold doing the work in-house, Big Blue does it using up to 2,000 server processors at its data center in Poughkeepsie, N.Y.
Under the deal, Santa Monica, Calif.-based Threshold pays only for the computing power it uses. This means the studio can get access to extra computing power during peak times, rather than being restricted by the number of machines it has on premises--saving the company valuable production time and lowering the risk carried by the financiers bankrolling its feature-length movies.
Bottom line:
Threshold Digital Research Labs contracts with IBM to handle the computing-intensive job of rendering animation images.
Threshold says it is saving 20 percent on its IT costs, but pay-as-you-go pricing will not be widely popular without billing systems that customers can understand and find reasonable.
The arrangement is an example of utility computing, an approach that is just starting to gain momentum. In utility computing, suppliers pipe applications and processing power across the Internet to customers, who pay a monthly charge depending on how much they consume, just as they do when they purchase water, gas or electricity. Proponents say the idea can fundamentally change the way corporations do business, giving them the flexibility to buy computing capacity in reaction to spikes in demand.
But before utility computing can go mainstream, the computing industry must address a basic question that has nothing to do with the technology itself: How should customers be charged for it?
Right now, businesses can sign contracts for hosted services, but the pricing of those services is generally worked out case by case. Until usage-based pricing mechanisms are worked out and clearly explained to customers, the buzz around utility computing may remain more hype than substance in the information technology market.
The goal of utility computing, according to the companies providing it, is to allow businesses to become more efficient by helping them make better use of their gear and services. Ideally, businesses will save money by buying only what they need, rather than investing in underused equipment and personnel.
In this scenario, pay-as-you-go pricing could make costs more predictable and reduce large expenditures for a computing-power user. However, even the best products and services must have a pricing system that customers can understand and find reasonable, if they are to succeed in generating business.
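As a rough illustration of the metering idea, the sketch below bills a month of server usage the way a power company bills kilowatt-hours. The rate and the usage figures are invented for this example, not drawn from any vendor's actual pricing:

```python
# Hypothetical metered-billing sketch: charge for computing from
# (per-day CPU-hour) meter readings, like a water or power utility.
# The rate below is an invented figure for illustration only.

RATE_PER_CPU_HOUR = 0.12  # assumed flat rate, in dollars


def monthly_bill(cpu_hours_by_day):
    """Sum a month's metered CPU-hours and price them at a flat rate."""
    total_hours = sum(cpu_hours_by_day)
    return round(total_hours * RATE_PER_CPU_HOUR, 2)


# A render farm that idles most of the month but spikes near a deadline:
usage = [40] * 25 + [1900] * 5  # 25 quiet days, then 5 peak days
print(monthly_bill(usage))      # pays for the spike, not for idle capacity
```

The point of the model is in the usage list: the customer is billed for the five peak days it actually consumed, rather than for owning enough machines to cover them year-round.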
"The end result is that you want a pricing model that can handle a variable demand. That's the whole theory of utility computing," said Tom Rhinelander, an analyst at New Rowley Group. "But you don't necessarily want variable pricing. You want to know what it will cost you."
Making matters worse for customers is confusion among suppliers over just what utility computing is. The industry's largest hardware providers--IBM, Hewlett-Packard and Sun Microsystems--have taken the lead in promoting the approach, but all use conflicting terminology and define their services differently.
Pricing the package
The central idea of utility computing--outsourced services--is well established and widely understood. But the addition of newer management technologies for servers, storage, networking and software into the mix requires the computing industry to reconsider its traditional pricing models.
"What we're seeing now is a lot of experimentation and reaching out to customers," Madden said. "It's a real investigation process."
IBM, which once ran advertisements featuring a stove to convey the idea of computing on tap, says that delivering services over a network--like a utility--is only one facet of its on-demand computing strategy.
As for HP, the company's research labs are modeling their services directly on the utilities business, drafting something called a "computon," which HP hopes will be the computing industry's equivalent to the kilowatt-hour. The idea is to have a generic unit that combines measurements of server processing power, storage and memory. The Palo Alto, Calif.-based company expects to submit the computon as a suggested industrywide measurement to a standards organization within a year.
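HP has not published how a computon would be calculated. Purely as a hypothetical sketch of the stated idea--folding server processing power, storage and memory into one generic billable unit, analogous to the kilowatt-hour--the function below uses invented weights:

```python
# Hypothetical "computon" calculation. HP has disclosed only the concept
# (one unit combining CPU, storage and memory measurements); every weight
# here is an invented assumption for illustration.

def computons(cpu_ghz_hours, storage_gb_hours, memory_gb_hours):
    """Combine three resource meters into a single generic unit."""
    W_CPU, W_STORAGE, W_MEMORY = 1.0, 0.01, 0.05  # assumed weights
    return (cpu_ghz_hours * W_CPU
            + storage_gb_hours * W_STORAGE
            + memory_gb_hours * W_MEMORY)


print(computons(100, 5_000, 200))  # 100 + 50 + 10 = 160.0 computons
```

Whatever the eventual weights, the appeal of such a unit is that rival suppliers' prices could be compared per computon, the way power rates are compared per kilowatt-hour.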
Big Blue, based in Armonk, N.Y., is preparing to announce a package of products and services to bolster its on-demand computing lineup for businesses. Customers can purchase IBM's technology to run in-house, contract for IBM to manage their data centers on site, or purchase outsourced services.
"IBM is trying to offer lots of flexible options; customers don't have to fit into a pigeonhole," said Audrey Rasmussen, an analyst at Enterprise Management Associates.
HP, too, wants to offer customers a range of utility computing options, from usage-based pricing for servers and storage, to full-scale hosted applications that carry per-transaction fees.
Usage-based payments are already standard options on HP's servers and storage devices designed to run in corporate data centers, according to Nick van der Zweep, the director of utility computing at HP. Purchasing computing a la carte is a notion central to HP's Utility Data Center (UDC) product, which is being tried out by large corporations and telecommunications service providers, he said.
UDC software creates virtual pools of computing resources--such as servers and storage devices--that can be shared by several applications. The virtualization and automated-provisioning features in UDC are designed to allow a company to automatically fire up extra servers and storage machines when needed--for example, to ensure that applications meet a predefined performance threshold.
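The provisioning behavior described above can be sketched as a simple high-water/low-water rule. This is illustrative only--it is not HP's actual UDC algorithm, and the function name and thresholds are assumptions:

```python
# Illustrative threshold-based provisioning in the spirit of UDC's
# described behavior: grow the server pool when load breaches a
# performance threshold, shrink it when capacity sits idle.
# Names and threshold values are invented assumptions.

def servers_needed(active_servers, avg_load, high=0.80, low=0.30):
    """Return a new server count for the pool given average load (0-1)."""
    if avg_load > high:
        return active_servers + 1          # fire up another machine
    if avg_load < low and active_servers > 1:
        return active_servers - 1          # release idle capacity
    return active_servers                  # within the comfort band


print(servers_needed(4, 0.92))  # scale out past the threshold
print(servers_needed(4, 0.15))  # scale back in when demand falls
```

Under usage-based pricing, the scale-in branch matters as much as the scale-out one: releasing idle machines is what actually lowers the bill.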
Sun hasn't laid out a distinct utility computing pricing plan. The company is focusing on developing its N1 technology. Sun says N1 can unite groups of computing, storage and networking equipment into a single resource that's easy to manage, efficient to use and quick to adjust to changing needs.
The Santa Clara, Calif.-based company intends to sell N1--along with hardware and software to build shared computing, or "grid," systems--to professional services companies, which, in turn, will offer utility computing services to customers.
Danish toy maker Lego is one of the most recent examples of a company using flexible computing to compete better in a cyclical industry. Central to Lego's decision to sign an agreement with IBM was a provision that lets the toy maker purchase server processing capacity in increments, so it can meet an anticipated surge in activity around the holiday-shopping season, according to Hal Yarbrough, Lego's senior director of global information technology.
That kind of arrangement seems ideal for customers, such as Threshold's Johnsen, who work in chaotic industries. While Johnsen acknowledged that utility computing has its share of problems, he said the logic behind the concept makes too much sense for it to fail.
"The service is not in the retail stage--it's still in development," Johnsen said. "But the promise is huge."