One of the most common statements made by IT vendors and "experts" about cloud-computing models is that somehow the economic crisis of the last few years is pushing enterprises to use cloud to save money on IT operations costs. Statements like the following are all too common:
The trend toward cloud computing, or Software-as-a-Service (SaaS), has accelerated during the economic crisis.
Or this one:
The fact that cloud computing is becoming a growing focus of attention right now seems to be no accident. "It is clear that the economic crisis is accelerating the adoption of cloud computing," explains Tommy Van Roye, IT-manager at Picanol.
Apparently, the idea is that tightening budgets have opened the minds of enterprises everywhere to the possibilities of cloud computing. That, in turn, seems to suggest that IT is somehow cheaper when run in cloud models.
That may or may not be the case, but I think the concept that the economic recession is driving interest in cloud is off the mark. What is driving enterprises to consider the cloud is ultimately the same thing that drives start-ups into the cloud: cash flow. Cash flow and the agility that comes from a more liquid "pay as you go" model.
The fact that the economy is recovering at the same time is just coincidence, in my opinion.
If you think about it, it makes perfect sense. No matter how much money an IT organization has in its coffers, or what the growth prospects are for its parent company or its industry, that organization suffers from problems that have nothing to do with budget and everything to do with a capital-intensive model.
In traditional IT, we build or buy an app--almost always with upfront licensing fees of some kind--then buy infrastructure on which to run the application. We then pay for a large percentage of the total cost of the application up front, and that investment is locked into the success or failure of that application.
Cloud computing changes that, however, in ways that have been stated by many over the last couple of years. The critical factor is that cloud computing can shift application investment from a capital-intensive upfront transaction to an ongoing operational expense.
This means that the age-old maxim of "you never get software right until the third major version" is much easier to swallow; try version one, fail, be out just the cost of compute time. Same with version two. Should version three be the first successful version, you aren't stuck with two sets of prior infrastructure that are now orphaned but still on the books.
(OK, I perhaps exaggerated reality a bit to make my point, but you get the picture. Without the upfront costs, IT experimentation is much, much more cost effective.)
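To make the experimentation arithmetic concrete, here's a quick back-of-the-envelope sketch in Python. Every dollar figure below is invented purely for illustration--real infrastructure and compute prices will vary wildly--but the shape of the comparison holds.

```python
# Illustrative only: all dollar figures are made up to show the shape
# of the argument, not real pricing.

UPFRONT_INFRA = 100_000   # hypothetical capital cost of infrastructure per version
MONTHLY_COMPUTE = 2_000   # hypothetical pay-as-you-go compute cost per month

def traditional_cost(versions_tried: int) -> int:
    """Capital-intensive model: each failed version leaves orphaned
    infrastructure that is still on the books."""
    return versions_tried * UPFRONT_INFRA

def cloud_cost(months_per_version: int, versions_tried: int) -> int:
    """Pay-as-you-go model: a failed version costs only the compute
    time it actually consumed."""
    return versions_tried * months_per_version * MONTHLY_COMPUTE

# Three versions, six months of compute each, before one finally sticks:
print(traditional_cost(3))   # capital sunk across all three attempts
print(cloud_cost(6, 3))      # total compute actually consumed
```

With these made-up numbers, the traditional model sinks $300,000 of capital into three attempts, while the pay-as-you-go model spends $36,000 on compute actually used. The absolute figures are fiction; the point is that the cost of a failed version scales with time consumed, not with infrastructure purchased.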
This is equally true of private cloud as it is of public cloud, though the former depends somewhat on the number of applications and services hosted in the cloud. The way I find it easiest to think about it is to see a company's private cloud as a competitor to the public cloud offerings that are out there. Then, from the business perspective, it's about cash flow in both the public and private cloud case.
The investment decision to build a private cloud then becomes a business modeling decision based on cost/benefit and risk analysis.
The thing I want you to walk away with after reading this post, however, is that cloud is less about squeezing out total cost of ownership, and much more about increasing the agility of IT, both technically and financially. I've even met CIOs who would be willing to pay more over time for cloud if it buys them an order-of-magnitude improvement in agility--which several are actually seeing.
How about your organization? Are you exploring cloud to help get through a weak economy, or are you driving toward a more agile IT organization regardless of how well off your company might be now and in the future?