One of the really difficult aspects of cloud computing for most established IT organizations is that the move to clouds, even private clouds, is not a simple, intuitive one. Replacing the bulk of both technology and process with a focus on capacity as a service--an automated, self-administered service--leaves many organizations "experimenting" with the cloud, but few pushing any barriers. To make matters worse, we are in that wonderful "discovery" phase of the technology, where there are few if any guides to doing it right with minimal risk, and those that do exist are generally personal opinions, not "burned in" recipes for success.
This post does not pretend to be such a recipe. However, over the course of the last several months, culminating in some great conversations with some really smart people over the last few weeks, I've come to realize that there is a basic maturity model for moving from data center consolidation architectures to true open market cloud architectures.
Remember maturity models? They've been around for some time, but a couple of years ago there was a small burst of creativity among system integrators and analysts alike, and maturity models were defined for a variety of IT subjects, ranging from business processes to technology architectures, such as SOA. The basic idea was to lay out some milestones, or even "gateways", to be achieved by IT as they worked towards achieving some idealized computing or process goal.
To that end, below is a simple five-phase maturity model that I and others believe describes the stages of evolution for an enterprise data center trying to achieve cloud Nirvana:
At a very high level, each step of the model breaks down like this:
- Consolidation is achieved as data centers discover ways to reduce redundancy and wasted space and equipment through measured planning of both architecture (including facilities allocation and design) and process.
- Abstraction occurs when data centers decouple the workloads and payloads of their data center infrastructure from the physical infrastructure itself, and manage to the abstraction instead of the infrastructure.
- Automation comes into play when data centers systematically remove manual labor requirements for run time operation of the data center.
- Utility is the stage at which data centers introduce the concepts of self-service and metering.
- Market is achieved when utilities can be brought together over the Internet to create an open, competitive marketplace for IT capabilities (an "Inter-cloud", so to speak).
Virtualization is actually a tool for consolidation, and that is probably its most widely applied function at this point, but consolidation can be achieved in other ways as well, including storage consolidation and the use of denser compute hardware (e.g. blades and dense switches). Consolidation certainly provides return on investment in most cases, but it doesn't really add anything to the ease with which hardware is provisioned and assigned.
Server virtualization is certainly a starting point for abstraction, but unless it is used specifically as the target representation of workload in the data center, and coupled with things like live migration and storage virtualization, it doesn't go far enough. Network abstractions have been around for a while (e.g. VLANs), but they have frequently been underused as abstractions, serving instead as a static consolidation mechanism. I think commercial clouds today are almost all providing abstraction as a service, and some move somewhat beyond that. Unfortunately, while abstraction enables dynamism, it doesn't maximize impact on system administrator productivity.
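To make the "manage to the abstraction, not the infrastructure" idea concrete, here is a minimal sketch (in Python, with hypothetical names -- nothing here is a real cloud API): a workload is described purely by what it needs, and a placement function binds it to whatever physical host happens to have capacity.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Workload:
    """What the application needs -- no mention of any physical machine."""
    name: str
    vcpus: int
    ram_gb: int

@dataclass
class Host:
    name: str
    free_vcpus: int
    free_ram_gb: int

def place(workload: Workload, hosts: List[Host]) -> Optional[str]:
    """Bind the abstract workload to any host with enough capacity.

    Which physical box it lands on is an implementation detail the
    operator no longer manages directly.
    """
    for host in hosts:
        if host.free_vcpus >= workload.vcpus and host.free_ram_gb >= workload.ram_gb:
            host.free_vcpus -= workload.vcpus
            host.free_ram_gb -= workload.ram_gb
            return host.name
    return None  # no capacity anywhere -- time to grow the pool

hosts = [Host("rack1-blade3", 2, 4), Host("rack2-blade7", 8, 32)]
print(place(Workload("web-frontend", vcpus=4, ram_gb=8), hosts))  # -> rack2-blade7
```

The point of the sketch is that once workloads are expressed this way, live migration and storage virtualization become just alternate implementations of `place` -- the workload definition never changes.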
Provisioning automation has been around for a while, but real maturity in automation requires things like pooling for rapid reallocation of compute resources, run time response to capacity demands, trouble ticket response automation (or elimination of trouble tickets for most automated response scenarios), and integrated system management and measurement. While automation can greatly reduce your operational expenses in the data center, there is much more that an IT organization can do to align operations to business needs.
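The "run time response to capacity demands" piece of automation is essentially a control loop. The sketch below (my own illustration -- the thresholds and scaling step are arbitrary assumptions, not a recommendation) shows the shape of a single tick of such a loop: no trouble ticket, no human in the path.

```python
def scale_decision(cpu_utilization: float, current_instances: int,
                   high: float = 0.80, low: float = 0.30,
                   min_instances: int = 1, max_instances: int = 20) -> int:
    """Return the new instance count for one control-loop tick.

    All thresholds are illustrative; a real system would also damp
    oscillation and integrate with monitoring and ticketing.
    """
    if cpu_utilization > high and current_instances < max_instances:
        return current_instances + 1   # pull capacity from the pool
    if cpu_utilization < low and current_instances > min_instances:
        return current_instances - 1   # release capacity back to the pool
    return current_instances           # steady state: do nothing

print(scale_decision(0.92, 4))  # -> 5 (scale up)
print(scale_decision(0.15, 4))  # -> 3 (scale down)
print(scale_decision(0.55, 4))  # -> 4 (no change)
```

Rapid reallocation from a pool is what makes this loop cheap to run: the "scale up" branch is a reassignment of already-abstracted capacity, not a procurement event.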
Creating a utility requires IT to get out of the way of the business units looking to allocate IT resources for various initiatives and core business requirements; in other words, to allow businesses to serve themselves. However, in order for self-service to work for the overall enterprise, the business units need feedback as to the cost of those resources; this is why metrics are so important to the utility model. Once both self-service and metering appear and are used consistently throughout IT, the data center starts looking like a true utility--though a very monopolistic one.
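The metering half of the utility model is conceptually simple: record consumption as it happens, price it, and feed the cost back to the consuming business unit. A toy sketch (rates and resource names are invented for illustration):

```python
from collections import defaultdict

# Illustrative unit prices -- real rates would come from IT finance.
RATES = {"vcpu_hours": 0.05, "gb_storage_hours": 0.002}

usage = defaultdict(float)

def meter(business_unit: str, resource: str, quantity: float) -> None:
    """Record consumption -- the feedback loop that makes self-service honest."""
    usage[(business_unit, resource)] += quantity

def monthly_bill(business_unit: str) -> float:
    """Sum metered usage times unit rate for one business unit."""
    return sum(qty * RATES[res]
               for (bu, res), qty in usage.items() if bu == business_unit)

meter("marketing", "vcpu_hours", 720)          # one vCPU running for a month
meter("marketing", "gb_storage_hours", 36000)  # 50 GB held for a month
print(round(monthly_bill("marketing"), 2))     # -> 108.0
```

The interesting part isn't the arithmetic; it's that the bill is generated from the same self-service events the business units trigger themselves, which is what turns IT from a gatekeeper into a utility.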
Achieving an open marketplace is essentially cloud computing nirvana, and the ideal toward which most enterprises should logically strive. Cloud providers, on the other hand, are likely to want to hold this off for a while, as it will create commodity markets for certain types of computing resources. In the end, I believe this open market is inevitable, as the economics are just too powerful.
From the evidence I can find, it appears that most competent IT organizations have managed some form of consolidation in the last five years, and many are well on their way to having a handle on how to use abstraction to add value to IT. Many forward-thinking organizations are already experimenting with or even implementing fairly comprehensive automation for scalable workloads, with some going farther and planning to automate all infrastructure. Almost no one but cloud providers (and a few dev labs) is at Utility (and I should point out that just having a portal where users can request capacity to be manually allocated later doesn't count). A true cloud marketplace is but a gleam in any technologist's eye at this point.
Let me know what you think about all of this. Where do you feel your organization is with respect to cloud maturity? Are you eager to climb the maturity ladder, or does it just not matter to you or your organization?