
Will computing flow like electricity?

Industry executives respond to Nicholas G. Carr's contention that utility computing will evolve in much the same way electricity did a century ago.

Martin LaMonica Former Staff writer, CNET News
Martin LaMonica is a senior writer covering green tech and cutting-edge technologies. He joined CNET in 2002 to cover enterprise IT and Web development and was previously executive editor of IT publication InfoWorld.
As a provocateur, he's very effective. As a prognosticator, he's less convincing.

Business writer Nicholas G. Carr raised many hackles in the information technology industry when he published a piece titled "IT Doesn't Matter" in 2003.

His latest piece with a similarly extreme headline, "The End of Corporate Computing," reopens the discussion of utility computing--the notion that corporations subscribe to computing services over the Internet much as they purchase electricity.


What's new:
Nicholas Carr, the author of the provocative article "IT Doesn't Matter," published another article, "The End of Corporate Computing," that predicts a large-scale shift to utility-like computing services.

Bottom line:
Industry executives queried by CNET News.com agreed that the computer industry will move to more hosted services over time. However, they see limitations to hosting and disagreed with the notion that the computing industry will evolve much as electricity did a century ago.


Yet Carr's latest article, published earlier this spring, failed to spark much industry soul-searching or a heated debate on the future of corporate computing.

IT executives queried by CNET News.com agreed that hosted services, or utility computing, will become more common and that corporations will take advantage of new technologies, such as Web services, grid computing and virtualization, to lower computing costs.

However, few executives envision a wholesale transition to utility computing, even in the distant future. None appeared to buy into Carr's assertion that the balance of power in the computing world could shift dramatically from technology infrastructure providers to Internet companies, such as Google, or to hosting companies.

For example, Charles Giancarlo, the chief technology officer of Cisco, downplayed the importance of utility computing scenarios. Like many others, Giancarlo said hosted services will become more important in certain situations but utility computing services will not be the norm in three to five years.

"We think (utility computing) makes sense for some small and medium-size businesses. But for large businesses, the decision to host applications outside or inside of the network depends on many different factors, including cost and network efficiency," said Giancarlo. "Some of the largest companies can run their own applications much cheaper and more efficiently than any utility computing provider."


Other executives said that Carr's prediction that utility computing will become the industry norm is predicated on improper assumptions about the complexity of computing or blind spots in his knowledge.

In particular, Carr downplays the competitive advantage that custom-built software applications can bring, compared to hosted offerings, said Eric Newcomer, chief technology officer at software maker Iona Technologies.

"Computers do not work without software. And unlike electricity or other raw technology, software is designed for direct human interaction," Newcomer said. "Overall, Carr has taken a very interesting analogy with some truth to it to an implausible extreme."

Meanwhile, readers of CNET News.com who responded to a news story on Carr's "End of Corporate Computing" piece voiced a mix of opinions.

"The notion of the computer as computing device has been obsoleted by the Internet. All of the real action these days is in using the computer as a *communications* device," wrote one reader.

Others said that utility computing has yet to prove indispensable to corporate customers.

"I'd suspect that IBM's old mainframe philosophy is behind this drive to utility computing. It's very easy to bill by the month and provide premium services by the hour," wrote one reader. However, he questioned the need for these services: "I'm not aware of any pressing need that can only be met by Utility Computing."

Data centers obsolete?
To frame his discussion in "The End of Corporate Computing," Carr draws an analogy to the electricity industry and its development a century ago.

Carr argues that corporate computing data centers are analogous to private generators, which were used in the early days of electricity. These power sources burned fuel to generate electricity for a single site, such as a department store or a wealthy person's home. (Tycoon J.P. Morgan was the first residential customer in New York City in the late 1800s.)

But private and small-scale power generators, which used direct current, were eventually displaced entirely by alternating current technology, which allowed utilities to send electricity over long distances, obviating the need for a local power plant and the people to run it.

To Carr, today's corporate data centers are the private power generators of old: inefficient, underutilized and too costly in the face of the network model of delivering IT services.

"As the technology matures and central distribution becomes possible, large-scale utility suppliers arise to displace the private providers. Although companies may take years to abandon their proprietary supply operations and all the sunk costs they represent, the savings offered by utilities eventually become too compelling to resist, even for the largest enterprises. Abandoning the old model becomes a competitive necessity," Carr wrote.

Nick Carr fires back
On Friday, the author of "The End of Corporate Computing" responded generally to IT executives' comments and criticisms.

"What we don't know is the ultimate shape of the IT utility model or the course of its development. That's what makes it so interesting--and so dangerous to current suppliers. What we do know is that the current model of private IT supply, where every company has to build and maintain its own IT power plant, is profoundly inefficient, requiring massively redundant investments in hardware, software and labor. Centralizing IT supply provides much more attractive economics, and as the necessary technologies for utility computing continue their rapid advance, the utility model will also advance. Smaller companies that lack economies of scale in their internal IT operations are currently the early adopters of the utility model, as they were for electric utilities....

"There are certainly tough challenges ahead for utility suppliers. Probably the biggest is establishing ironclad security for each individual client's data as hardware and software assets become shared. The security issue will require technological breakthroughs, and I have faith that the IT industry will achieve them, probably pretty quickly."

If technology and marketing investments are any indicator, many computing companies firmly agree that utility computing will become "too compelling to resist."

Starting in 2002 with the launch of IBM's On-Demand vision of more flexible computing, several vendors have gotten on the utility computing bandwagon. Sun used the name N1 to describe its data-center software, and Hewlett-Packard used the term Adaptive Enterprise.

However, the initial efforts of technology providers large and small--efforts that are still in development--have focused primarily on infrastructure technology, rather than hosted services, as the way to make corporate data centers more efficient.

Yet at the same time, a growing number of Internet-delivered services aimed at corporations have emerged.

IBM offers hosted processing power and applications to companies, while Sun earlier this year launched its Sun Grid initiative, in which customers pay a flat rate of $1 per hour per CPU, a fee-for-service structure similar to those used by utility companies. Meanwhile, Salesforce.com and Google, which both deliver services via the Internet, were two of the most high-profile stock market entrants last year.
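Sun Grid's published flat rate makes the utility-style, metered billing model easy to illustrate. The sketch below uses the article's $1-per-CPU-per-hour figure; the job sizes are hypothetical examples, not from the article.

```python
# Metered, utility-style billing as described for Sun Grid:
# a flat rate of $1 per CPU per hour (rate from the article).
RATE_PER_CPU_HOUR = 1.00  # USD

def job_cost(cpus: int, hours: float) -> float:
    """Pay only for the capacity actually consumed, like an electric meter."""
    return cpus * hours * RATE_PER_CPU_HOUR

# Hypothetical example: an overnight batch job on 100 CPUs for 8 hours.
print(job_cost(100, 8))  # 800.0
```

The appeal of the model is visible in the arithmetic: the customer pays $800 for a burst of capacity, with no capital outlay for the 100 machines that sit idle the rest of the month.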

Chief electricity officer?
But while utility computing is an enticing idea, holding up the electricity industry as the model for how computing should evolve doesn't sit right with all IT executives.

Peter Lee, CEO of grid software company DataSynapse, said that Carr's conclusion that the combination of virtualization, grid computing and Web services will result in utility computing is "100% spot-on." But he said the electricity industry analogy doesn't hold up entirely.

"We do not think the computing industry will eventually resemble the electricity industry as an exact parallel, because unlike electricity, there are many more variables in terms of computing power that would need to be standardized," Lee said. "Computing will, however, become much more utility-like, both in terms of pricing and in terms of on-demand power."

In his piece, Carr theorizes about how the shift to utility computing could reshape the competitive forces in today's computing industry. He argues that the leading "utility suppliers" of the future will be today's large hardware providers, specialized hosting companies such as Digex, Internet outfits such as Google and Amazon, or as-yet-undiscovered start-ups.

Longtime computing industry executive Kim Polese, now CEO of open-source start-up SpikeSource, said that Carr's competitive analysis should factor in the effects of open source and of offshore development in emerging markets, both of which are causing "huge disruptions."

"This means to me that we can't assume that competition will come from the usual places," Polese said. "The leaders of tomorrow may not even exist today, but they could grow offshore from start-up into sizable companies quickly given the strong demand for their services. The computing utility services may be arbitraged across a network of service providers, of various sizes, with pricing developed via dynamic price discovery."

Microsoft, meanwhile, is well positioned to take advantage of any move to hosted services, said Bob Muglia, senior vice president of Microsoft's Windows Server division.

"I think there will be a split. Companies will outsource things that can be very effectively run for an inexpensive price by others...On the other hand, I do think there will always be areas where people are putting in investment to drive business advantage that will either remain in-sourced or under very tight control of outsourcing--not purely hosted. There's a mixture of all these things," Muglia said. "We'll work well in both environments."

IBM's Ambuj Goyal, the general manager of IBM's Lotus division and former strategy executive in Big Blue's software group, fully buys into the notion of utility computing: He wrote a paper for IBM on the subject 10 years ago and offers hosted services for some Lotus products.

However, as with many discussions about the future, the reality will likely lie somewhere between extreme positions.

"Rather than take a 50,000-foot view...you need to get down to earth and look at individual cases," Goyal said. "A standardized utility model has a role, but what a business should do depends on each particular case."

CNET News.com's Marguerite Reardon contributed to this article.