
Utility's last snag: The price tag

Before utility computing can go mainstream, providers have to get to grips with a basic question: How should they calculate the charges?

Moviemaking is a risky business by any measure, so it makes sense for film studios to avoid spending money on production and equipment whenever they can.

That's exactly why animation studio Threshold Digital Research Labs signed up IBM to handle the labor-intensive job of computer-rendering animation images. Instead of Threshold doing the work in-house, Big Blue does it using up to 2,000 server processors at its data center in Poughkeepsie, N.Y.

Under the deal, Santa Monica, Calif.-based Threshold pays only for the computing power it uses. This means the studio can get access to extra computing power during peak times, rather than being restricted by the number of machines it has on premises--saving the company valuable production time and lowering the risk carried by the financiers bankrolling its feature-length movies.


What's new:
Threshold Digital Research Labs contracts with IBM to handle the computing-intensive job of rendering animation images.

Bottom line:
Threshold says it is saving 20 percent on its IT costs, but pay-as-you-go pricing will not be widely popular without billing systems that customers can understand and find reasonable.


"To do all the processing on a local basis just doesn't make sense," said George Johnsen, chief technology/animation officer at Threshold, who estimated that his company has saved about 20 percent with the pay-as-you-go deal. This deal "really allows me to change my business and take the handcuffs off on how I can plan a show."

The arrangement is an example of utility computing, an approach that is just starting to gain momentum. In utility computing, suppliers pipe applications and processing power across the Internet to customers, who pay a monthly charge depending on how much they consume, just as they do when they purchase water, gas or electricity. Proponents say the idea can fundamentally change the way corporations do business, giving them the flexibility to buy computing capacity in reaction to spikes in demand.
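The metering model described here can be illustrated with a simple sketch. All of the names, rates and usage figures below are hypothetical, invented for illustration rather than drawn from any vendor's actual pricing:

```python
# Hypothetical metered-billing sketch: charge for compute the way a
# utility charges for electricity -- usage in a period times a unit rate.

def monthly_bill(cpu_hours: float, rate_per_cpu_hour: float,
                 minimum_charge: float = 0.0) -> float:
    """Return the charge for one billing period.

    cpu_hours          -- metered processor-hours consumed this month
    rate_per_cpu_hour  -- price per processor-hour (hypothetical figure)
    minimum_charge     -- optional floor, as many utilities impose
    """
    return max(cpu_hours * rate_per_cpu_hour, minimum_charge)

# A studio that rendered for 1,500 processor-hours at $0.40 per hour:
print(monthly_bill(1500, 0.40))                      # 600.0
# A quiet month still incurs the floor charge:
print(monthly_bill(0, 0.40, minimum_charge=50.0))    # 50.0
```

The appeal for a customer like Threshold is visible in the arithmetic: in a slow month the bill falls toward the floor, while a burst of rendering work raises it only for that period, instead of requiring servers purchased up front.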

But before utility computing can go mainstream, the computing industry must address a basic question that has nothing to do with the technology itself: How should customers be charged for it?

Right now, businesses can sign contracts for hosted services, but the pricing of those services is generally worked out case by case. Until usage-based pricing mechanisms are worked out and clearly explained to customers, the buzz around utility computing may remain more hype than substance in the information technology market.

"Vendors, if they really want their 'on-demand,' or utility, computing strategies to take off, they need to have a complete story," said John Madden, an analyst at Summit Strategies. "They're realizing that if they come up with new ways to use IT, they have to come up with new ways to pay for IT."

The goal of utility computing, according to the companies providing it, is to allow businesses to become more efficient by helping them make better use of their gear and services. Ideally, businesses will save money by buying only what they need, rather than investing in underused equipment and personnel.

In this scenario, pay-as-you-go pricing could make costs more predictable and reduce large up-front expenditures for computing-power buyers. However, even the best products and services need a pricing system that customers can understand and consider reasonable if they are to generate business.


The first usage-based purchasing plans, launched by Hewlett-Packard and others, have yet to pull in many customers--a situation that is raising concerns over whether buyers are uncomfortable with the lack of a set price. Although the cost of utility computing is supposed to be as simple to understand as a monthly bill, too many pricing variables could make predicting payments complicated for corporate customers.

"The end result is that you want a pricing model that can handle a variable demand. That's the whole theory of utility computing," said Tom Rhinelander, an analyst at New Rowley Group. "But you don't necessarily want variable pricing. You want to know what it will cost you."

Making matters worse for customers is confusion among suppliers over just what utility computing is. The industry's largest hardware providers--IBM, Hewlett-Packard and Sun Microsystems--have taken the lead in promoting the approach, but all use conflicting terminology and define their services differently.

Pricing the package
The central idea of utility computing--outsourced services--is well established and widely understood. But the addition of newer management technologies for servers, storage, networking and software into the mix requires the computing industry to reconsider its traditional pricing models.

"What we're seeing now is a lot of experimentation and reaching out to customers," Madden said. "It's a real investigation process."

IBM, which once ran advertisements featuring a stove to convey the idea of computing on tap, says that delivering services over a network--like a utility--is only one facet of its on-demand computing initiative.

As for HP, the company's research labs are modeling their services directly on the utilities business, drafting something called a "computon," which HP hopes will be the computing industry's equivalent to the kilowatt-hour. The idea is to have a generic unit that combines measurements of server processing power, storage and memory. The Palo Alto, Calif.-based company expects to submit the computon as a suggested industrywide measurement to a standards organization within a year.

Some hardware companies have designed usage-based purchasing plans that, while not true per-hour utility pricing, do allow customers to tap processor power based on changing needs. HP, IBM and Sun each offer customers the option of expanding the capacity of servers they run on-site, paying more for the additional usage.

Big Blue, based in Armonk, N.Y., is preparing to announce a package of products and services to bolster its on-demand computing lineup for businesses. Customers can purchase IBM's software to run in-house, contract for IBM to manage their data centers on site, or purchase outsourced services.

"IBM is trying to offer lots of flexible options; customers don't have to fit into a pigeonhole," said Audrey Rasmussen, an analyst at Enterprise Management Associates.

HP, too, wants to offer customers a range of utility computing options, from usage-based pricing for servers and storage, to full-scale hosted applications that carry per-transaction fees.

Usage-based payments are already standard options on HP's servers and storage devices designed to run in corporate data centers, according to Nick van der Zweep, the director of utility computing at HP. Purchasing computing a la carte is a notion central to HP's Utility Data Center (UDC) product, which is being tried out by large corporations and telecommunications service providers, he said.

UDC software creates virtual pools of computing resources--such as servers and storage devices--that can be shared by several applications. The virtualization and automated-provisioning features in UDC are designed to allow a company to automatically fire up extra servers and storage machines when needed--for example, to ensure that applications meet a predefined performance threshold.
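The provisioning behavior described above, adding capacity when an application misses a performance target and releasing it when load subsides, can be sketched as a simple control loop. The function names and thresholds here are assumptions for illustration, not UDC's actual interface:

```python
# Hypothetical auto-provisioning step in the spirit of UDC: pull an extra
# server from a shared pool when response time breaches a threshold, and
# return one to the pool when load drops well below it.  Names invented.

def rebalance(active_servers: int, free_pool: int,
              avg_response_ms: float, target_ms: float = 200.0) -> int:
    """Return the new active-server count after one control step."""
    if avg_response_ms > target_ms and free_pool > 0:
        return active_servers + 1      # provision one more server
    if avg_response_ms < target_ms * 0.5 and active_servers > 1:
        return active_servers - 1      # release capacity back to the pool
    return active_servers              # within bounds; do nothing

print(rebalance(4, free_pool=6, avg_response_ms=350.0))  # 5 (scale up)
print(rebalance(4, free_pool=6, avg_response_ms=80.0))   # 3 (scale down)
print(rebalance(4, free_pool=0, avg_response_ms=350.0))  # 4 (pool empty)
```

A real system layers much more on top of this, such as per-application priorities and billing for the borrowed capacity, but the loop captures why virtualized pools matter: the spare servers must already exist somewhere shareable for the scale-up branch to have anything to grab.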

"All the vendors are basically taking their new systems management software and turning around and using them to be able to offer hosted services in a new financial arrangement," said Amy Wohl, the president of research firm Wohl and Associates. "This can be very appealing to customers, who would prefer an expense item rather than a capital expenditure because they can spread out the payments and save money."

Sun hasn't laid out a distinct utility computing pricing plan. The company is focusing on developing its N1 software. Sun says N1 can unite groups of computing, storage and networking equipment into a single resource that's easy to manage, efficient to use and quick to adjust to changing needs.

The Santa Clara, Calif.-based company intends to sell N1--along with hardware and software for building shared computing, or "grid," systems--to professional services companies, which will in turn offer utility computing services to customers.

Danish toy maker Lego is one of the most recent examples of a company using flexible computing to compete better in a cyclical industry. Central to Lego's decision to sign an on-demand computing deal with IBM was a provision that lets the toy maker purchase server processing capacity in increments, so it can meet an anticipated surge in activity around the holiday-shopping season, according to Hal Yarbrough, Lego's senior director of global information technology.

That kind of arrangement seems ideal for customers, such as Threshold's Johnsen, who work in chaotic industries. While Johnsen acknowledged that utility computing has its share of problems, he said the logic behind the concept makes too much sense for it to fail.

"The service is not in the retail stage--it's still in development," Johnsen said. "But the promise is huge."