
Capacity aggregation: Cloud's great experiment

One of the interesting things about cloud computing as a disruption to "traditional IT" is the experimentation and innovation it encourages. Take the new cloud aggregators.

James Urquhart
James Urquhart is a field technologist with almost 20 years of experience in distributed-systems development and deployment, focusing on service-oriented architectures, cloud computing, and virtualization. James is a market strategist for cloud computing at Cisco Systems and an adviser to EnStratus, though the opinions expressed here are strictly his own. He is a member of the CNET Blog Network and is not an employee of CNET.

In my last post, I gave you an outline of what I see as the three biggest "killer apps" of cloud computing. There is, however, another facet to the cloud story that I think is very exciting right now: innovation on the core technical and operational models that form the basis of distributed computing.


What I mean by that is this: cloud has made new ways of acquiring and consuming infrastructure, platforms, and applications readily available to an increasingly broad market of potential users. The financial model--pay-as-you-go--makes failure much, much cheaper than it was with models in which the application owner had to lay out large amounts of capital up front to have somewhere to run their application.

That ease of access and experimentation makes cloud a new tool in the toolbox of technologists. And, as in any craft where useful new tools are introduced, those technologists are now trying to see what problems they can solve that weren't solvable before. Today, the cloud is a place where the so-called envelope is being pushed to new extremes.

One of the most important of these experiments today is the introduction of true compute capacity aggregators--market services where capacity is available on demand, from multiple providers, with price and service competition.

Achieving a true capacity market, in which capacity can be traded as a commodity product like wheat or energy, is an extremely difficult problem to solve. In fact, I'm on record as saying it will be many years before the technical and legal barriers to such a model are removed.

However, I may be proven wrong, if services like Enomaly's SpotCloud, ScaleUp's Cloud Management Platform (specifically its new federation features), and stealth start-up ComputeNext (outlined by CloudAve blogger Krishnan Subramanian) have their way. These services aim to make the acquisition of compute capacity consistent across multiple sources, which is the beginning of an exchange market model.

The overall model is simple: those with capacity make it available to the service (though how that is done seems to vary by offering), and those who need capacity come to the service, find what they need, and consume it. SpotCloud is the most mature--you can play with it today--and the others appear to be coming online over the next several months.
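To make that publish-and-find flow a bit more concrete, here is a minimal sketch of it. It is purely illustrative: the Offer and CapacityExchange names, and every field on them, are my own assumptions, not the API of SpotCloud, ScaleUp, or ComputeNext.

```python
# Hypothetical sketch of the aggregator model: providers publish capacity
# offers, and a consumer asks the exchange for the cheapest offer that
# meets its requirements. Names and fields are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Offer:
    provider: str          # who is selling the capacity
    vcpus: int             # size of the instance on offer
    ram_gb: int
    price_per_hour: float

class CapacityExchange:
    def __init__(self):
        self.offers = []

    def publish(self, offer: Offer) -> None:
        """A provider lists spare capacity on the exchange."""
        self.offers.append(offer)

    def find_cheapest(self, min_vcpus: int, min_ram_gb: int):
        """A consumer asks for the lowest-priced offer that fits its needs."""
        candidates = [o for o in self.offers
                      if o.vcpus >= min_vcpus and o.ram_gb >= min_ram_gb]
        return min(candidates, key=lambda o: o.price_per_hour, default=None)

exchange = CapacityExchange()
exchange.publish(Offer("provider-a", vcpus=2, ram_gb=4, price_per_hour=0.05))
exchange.publish(Offer("provider-b", vcpus=4, ram_gb=8, price_per_hour=0.04))
print(exchange.find_cheapest(min_vcpus=2, min_ram_gb=4))
```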

The question these experimental models hope to answer is twofold. First, what form will the compute exchange market take? Both SpotCloud and ScaleUp borrow from online travel industry models. (SpotCloud is modeled somewhat after travel clearinghouse Hotwire, and ScaleUp after aggregators like Orbitz or Travelocity.) According to Subramanian's post, ComputeNext is taking more of a search engine approach, though how it will monetize that is unclear.

Second, how does one run various kinds of applications in what is almost inherently a transient infrastructure model? Given the fact that there is little guarantee that any given capacity will be available on a long-term basis, what types of applications can consume it today, and what kinds of innovations will expand that target market?

SpotCloud, in fact, forces this question, as its capacity is transient by definition (though it recently added instance renewal). So the question becomes: is it a limited tool, or will some software developer create new management tools that run a distributed, "fail ready" application on transient infrastructure, creating new instances to replace expired ones when required, without losing performance or availability?
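To show what such a tool might do in the simplest terms, here is a sketch of the reconciliation loop a "fail ready" manager could run: keep a desired number of instances alive on transient capacity, replacing any that expire. The provision_instance and is_expired hooks are hypothetical stand-ins, not any real aggregator's or management product's API.

```python
# Illustrative controller loop for running on transient capacity:
# periodically drop expired instances and top back up to the desired count.
import time

DESIRED_INSTANCES = 3
CHECK_INTERVAL_SECONDS = 30

def provision_instance():
    """Stand-in for acquiring a new transient instance from an aggregator."""
    return {"id": time.time()}  # a real call would hit the capacity service

def is_expired(instance) -> bool:
    """Stand-in for checking whether the provider has reclaimed the capacity."""
    return False  # a real check would query the aggregator or provider

def reconcile(instances):
    """Remove expired instances and provision replacements as needed."""
    alive = [i for i in instances if not is_expired(i)]
    while len(alive) < DESIRED_INSTANCES:
        alive.append(provision_instance())
    return alive

def run_controller():
    instances = reconcile([])                  # bring up the initial fleet
    while True:
        time.sleep(CHECK_INTERVAL_SECONDS)
        instances = reconcile(instances)       # replace anything that expired
```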

By the way, there is no guarantee that these aggregators will be the source of compute exchanges. Other application-level management tools, such as enStratus (disclaimer: I am an adviser) and RightScale could handle capacity evaluation and acquisition in the application management plane itself, rather than as an online service consumed by the application management tools.

However, aggregation is one model that has to be explored before we can pick a utility "standard."

There are many people who believe that some large portion of compute capacity will be provided in a utility model in the future. Are the early cloud aggregators of today the path to that vision? I'm not sure, but I can't wait to see how these experiments turn out.