There is no 'Big Switch' for cloud computing

Network-centric computing is here to stay, but cloud computing as an electric-grid-style utility remains something for the distant future.

Gordon Haff
Gordon Haff is Red Hat's cloud evangelist, although the opinions expressed here are strictly his own. He's focused on enterprise IT, especially cloud computing. However, Gordon writes about a wide range of topics, whether they relate to the way too many hours he spends traveling or his longtime interest in photography.

By now, most people involved with IT are familiar with at least the broad outlines of cloud computing--the idea that applications run somewhere out in the network. We just get back data streams or Web pages; the actual crunching, connecting, and correlating happens somewhere else.

Plenty of people, including myself, have taken cuts at defining cloud computing with a bit more rigor. I've come to believe that particular exercise can still be useful for thinking about different use cases and different market segments, but I don't expect we'll ever see a canonical definition. Too many people have too many different perspectives--and particular interests in having some aspects, say "private clouds," be viewed in a particular way.

However, specifics of the cloud-computing taxonomy aside, it's worth noting that the vision of cloud computing, as originally broached by its popularizers, wasn't just about more loosely coupled applications being delivered over networks in more standardized and interoperable ways--a sort of next-generation service-oriented architecture, if you will. Rather, that vision was about a fundamental change to the economics of computing.

As recounted by, among others, Nick Carr in his The Big Switch, cloud computing metaphorically mirrors the evolution of power generation and distribution. Industrial-revolution factories--such as those that once occupied many of the riverside brick buildings I overlook from my Nashua, N.H., office--built largely customized systems to run looms and other automated tools, powered by water and other sources.

These power generation and distribution systems were a competitive differentiator; the more power you had, the more machines you could run, and the more you could produce for sale. Today, by contrast, power (in the form of electricity) is just a commodity for most companies--something that they pull off the grid and pay for based on how much they use.

Some companies may indeed generate power in a small way--typically as backup in outages or as part of a co-generation setup--but you'll find little argument that mainstream power requirements are best met by the electric utility. The Big Switch argues that computing is on a similar trajectory.

And that posits cloud computing as a much more fundamentally disruptive economic model than a mostly gradual shift toward software being delivered as a service and IT being incrementally outsourced to larger IT organizations. It posits a world with the five "computers" (which is to say, complexes of computers) that Sun CTO Greg Papadopoulos hyperbolically predicted--or at least far, far fewer organizations doing computing than today.

Such an IT landscape would look very different--profoundly affecting, just for a start, any vendor competing in it. And that's without even discussing all the regulatory, privacy, and control of information issues that would assume great prominence.

It's an intriguing and big argument, and one well told. I've also come to think it's mostly wrong--at least over any time frame that we care about as a practical matter.

I'm emphatically not arguing against cloud computing in the small-"c" sense. Computing is getting more network-centric. Check. Less tied to the physical infrastructure it was initially installed on. Check. More dynamic. Check. More modular. Check. And so forth. Check. Check. Check.

In fact, I even expect that we will see a pretty large-scale shift among small and medium businesses away from running their own e-mail systems and other applications--a shift we've already seen among consumers, for whom Google search and applications and Web 2.0 sites are all aspects of cloud computing.

And there are economically interesting aspects to this change. No longer do you need to roll in (and finance) pallets of computers to jump-start a company; you go to the Web site for Amazon Web Services. One implication is lower barriers to entry for many types of businesses.

But that's not the sort of near-term economic shift that the electric grid brought about. Rather, the grid made the homegrown power systems of the enterprises of the day both unnecessary and obsolete. And it did so relatively quickly.

And that is what I don't see happening any time soon, on a substantial scale, with cloud computing. So far, there is scant evidence that, once you reach the size of industrialized data center operations (call it a couple of data centers to take care of redundancy), the operational economics associated with an order of magnitude greater scale are compelling.

Doubtless, megaservice providers have some scale advantages--especially if they get to concentrate on a narrow slice of application types. But consider all the reasons enterprises might want or need to continue running applications in-house: control and visibility, compliance, integration with legacy IT, and so forth. It will take a lot of financial incentive to overcome those countervailing factors.

I'm skeptical that the incentive is there for core enterprise IT at organizations with at least a modest scale of their own.