Much has been touted about cloud computing's effects on the commoditization (and commodification) of IT resources and services. However, what that process will look like to the end user may not be what you think.
James Urquhart is a field technologist with almost 20 years of experience in distributed-systems development and deployment, focusing on service-oriented architectures, cloud computing, and virtualization. James is a market strategist for cloud computing at Cisco Systems and an adviser to EnStratus, though the opinions expressed here are strictly his own. He is a member of the CNET Blog Network and is not an employee of CNET.
One of my favorite bloggers (and long-time cloud pundit), Simon Wardley, once wrote a short post that clarified the meanings of two words that are key to understanding the value of cloud computing:
I thought I'd just re-iterate the distinction between [two] terms that was first identified by Douglas Rushkoff:-
Commodification (mid to late 1970s, Word) is used to describe the process by which something which does not have an economic value is assigned a value and hence how market values can replace other social values. It describes a modification of relationships, formerly untainted by commerce, into commercial relationships.
Commoditization (early to mid 1990s, Neologism) is the process by which goods that have economic value and are distinguishable in terms of attributes (uniqueness or brand) end up becoming simple commodities in the eyes of the market or consumers. It is the movement of a market from differentiated to undifferentiated price competition, from monopolistic to perfect competition.
You should definitely read the rest of Wardley's post to get a clear sense of where each word applies, but I wanted to make sure you understood these two concepts because there are some interesting debates about how commoditization and commodification apply to cloud computing.
On the one hand, you have those who believe that cloud computing means the end of infrastructure differentiation for at least enterprise software, if not all network-based software in general. The theory goes something like this:
If cloud computing is driven to offer core capabilities (such as raw server capacity or even Java application execution) to as wide a market as possible,
and if that market is driven to build applications that can be ported between cloud offerings as easily as possible, and executed as cheaply as possible,
then infrastructure will be driven to a common set of architecture and operations standards which will eliminate differentiation in terms of those core capabilities.
Contrast this with the view that commoditization in the cloud will happen at a much more granular level: technologies will standardize various components and features, but services themselves can and will be differentiated, often adding significant value over those base capabilities.
This theory sounds more like:
If applications require certain elements of the compute environment to be commodity with well-defined interfaces,
and if the cloud market as a whole is seeking innovative services that add value to basic computing capabilities,
then providers will compete by building differentiated, value-added services on top of those commodity components, rather than on the commodity elements themselves.
This vision is more of a "push to the bottom" approach where the best innovative services will be commodified and commoditized over time, while new capabilities continue to be developed on top of that base. I'm much more comfortable with this concept, though some may call me biased because I work for a major systems vendor, Cisco Systems.
However, and much more importantly, service providers are much more comfortable with the latter vision as well. Take a look at Amazon Web Services, for instance. In addition to EC2 (arguably a move toward commoditization of computing) and Elastic Block Storage and S3 (two ways of commoditizing storage), Amazon offers an array of value-added services to aid developers and operators working with its services. For example, there is CloudFront, DevPay, Simple Queue Service, and so on.
A developer working with Amazon today who uses these services directly will find that they are using a combination of relatively portable, near-commodity services and others that are truly unique (at least in implementation) to Amazon.
The second theory allows cloud providers to innovate on features as well as service, and possibly offer those features at a premium. If the market values an innovation, customers will buy it. If they don't, they will ignore it, and the service provider will either drop the price of the feature or drop the feature itself.
That's not to say that the two theories couldn't play out side by side. With cloud service brokers in the mix, there might be those that offer value-added services that leverage pure commodity services from others, for example. There may also be a market of developers willing to architect around "least common denominator" commodity infrastructure services in order to save significant money per compute cycle and/or storage bit.
The key thing here is that the market will make the decision, but there are trade-offs that have yet to be evaluated by the vast majority of the market. To commit to an outcome now is like declaring a winner in a tightly contested race with 5 percent of the vote counted.
I'd love to hear what you think cloud computing will do to the commoditization (and, perhaps, the commodification) of information technologies. And, yes, "cloud computing will fail" is an option, but if that's your stance, tell me why cloud computing fails, and how commoditization and/or commodification contributed to that failure. I look forward to your thoughts.