
How evolution begat the cloud revolution

Cloud computing is a revolutionary IT trend. But, as with many such revolutions, it comes about from the convergence of a number of largely evolutionary threads of change.

Asking why cloud computing is happening today is something of a tautology. That's because an inclusive definition of cloud computing essentially equates it with a broad swath of the major advances happening in how IT is operated and delivered today.

Pervasive virtualization, fast application and service provisioning, elastic response to load changes, low-touch management, network-centric access, and the ability to move workloads from one location to another are all hallmarks of cloud computing. In other words, cloud computing is more of a shorthand for the "interesting stuff going on in IT" than it is a specific technology or approach.

Photo: Archaeopteryx is widely considered to be the first bird, but it actually had more in common with theropod dinosaurs than with modern birds. (H. Raab/CC Wikimedia)

But that doesn't make the question meaningless. It would be hard to argue that there isn't a huge amount of excitement (and, yes, hype) around changing the way that we operate data centers, access applications, and deploy new services. So forget the cloud computing moniker if you will. Why is this broad-based rush to do things differently happening right now?

The answer lies in how largely evolutionary trends can, given the right circumstances, come together in a way that results in something that's quite revolutionary.

Take the Internet. The first ARPANET link--the Internet's predecessor--dates to 1969. Something akin to hypertext was first described by Vannevar Bush in a 1945 article and Apple shipped Hypercard in 1984. But it took the convergence of things like inexpensive personal computers with graphical user interfaces, faster and more standardized networking, the rise of scale-out servers, the World Wide Web, the Mosaic browser, open source software like Linux and Apache, and the start-up culture of Silicon Valley to usher in the Internet as we know it today. And that convergence, once it began, happened quite quickly and dramatically.

The same could be said of cloud computing. The following interrelated trends are among those converging to make cloud computing possible.

Comfort level with and maturation of mainstream server virtualization. Virtualization serves as the foundation for several types of cloud computing, including public Infrastructure-as-a-Service clouds like Amazon's and most private cloud implementations. So, in this respect, mature server virtualization software is a prerequisite for cloud computing. But the connection goes beyond technology. Increasingly ubiquitous virtualization has required that users get comfortable with the idea that they don't know exactly where their applications are physically running. Cloud computing is even more dependent on accepting a layer of abstraction between software and its hardware infrastructure.
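That abstraction can be sketched in a few lines of Python. This is a toy model for illustration only, not any real hypervisor's API: a scheduler places guests on whichever physical host it chooses, callers address a guest only by name, and the guest can later be migrated without the caller ever learning, or caring, which machine it runs on.

```python
import random

class Hypervisor:
    """Toy model of the abstraction layer: callers never learn the host."""

    def __init__(self, hosts):
        self.hosts = hosts       # physical machines, hidden from users
        self.placement = {}      # VM name -> current host

    def launch(self, vm_name):
        # The scheduler, not the user, picks the physical host.
        self.placement[vm_name] = random.choice(self.hosts)
        return vm_name           # the user gets a handle, not a location

    def migrate(self, vm_name):
        # Workload mobility: move the VM without the user noticing.
        current = self.placement[vm_name]
        others = [h for h in self.hosts if h != current]
        self.placement[vm_name] = random.choice(others)

hv = Hypervisor(["host-a", "host-b", "host-c"])
vm = hv.launch("crm-app")
hv.migrate(vm)                   # the app keeps its name; the host changes
```

The point of the sketch is the indirection itself: everything above the `Hypervisor` line of code sees a stable name, while everything below it is free to rearrange hardware.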

The build-out of a vendor and software ecosystem alongside and on top of virtualization. From a technology perspective, cloud computing is about the layering of automation tools, including, over time, those for policy-based administration and self-service management. From this perspective, cloud computing is the logical outgrowth of virtualization-based services or--put another way--the layering of resource abstraction on top of the hardware abstraction that virtualization provides. Cloud computing can also involve concepts like pay-per-use pricing, but these too have existed in various forms in earlier generations of computing.
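The layering described above can be illustrated with another toy Python sketch, again hypothetical rather than any vendor's actual API: a self-service layer hands out capacity on request with no operator in the loop, and a metering layer turns elapsed usage into a pay-per-use charge (the rate shown is invented for the example).

```python
import time

class SelfServiceCloud:
    """Toy layering: self-service provisioning plus pay-per-use metering."""

    RATE_PER_SECOND = 0.0001     # hypothetical price, for illustration only

    def __init__(self):
        self.instances = {}      # instance name -> provisioning timestamp

    def provision(self, name):
        # Self-service: no ticket, no operator; capacity appears on request.
        self.instances[name] = time.time()

    def release(self, name):
        # Pay-per-use: billing stops the moment the resource is returned.
        elapsed = time.time() - self.instances.pop(name)
        return elapsed * self.RATE_PER_SECOND

cloud = SelfServiceCloud()
cloud.provision("test-env")
cost = cloud.release("test-env")  # charged only for the seconds actually used
```

Neither half of the sketch is novel on its own; the article's argument is that stacking them on mature virtualization is what produced something new.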

Browser-based application access. The flip side of mobile workloads is mobility of access devices. Many enterprise applications historically depended on the use of specific client software. (In this respect, client-server and then PCs represented something of a step back relative to applications accessed with just a green-screen terminal.) The trend towards being able to access applications from any browser is essentially a prerequisite for the public cloud model and helps make internal IT more flexible as well. I'd argue that ubiquitous browser-based application access is one of the big differences between today's hosted software and Application Service Providers circa 2000.

Mobility and the consumerization of IT are also driving the move to applications that aren't dependent on a specific client configuration or location. For more than a decade, we've seen an inexorable shift from PCs connected to a local area network to laptops running on Wi-Fi to an increasing diversity of devices hooked to all manner of networks. Fewer and fewer of these devices are even supplied by the company and many are used for both personal and business purposes. All this further reinforces the shift away from dedicated, hard-wired corporate computing assets.

The expectations created by consumer-oriented Web services. The likes of Facebook, Flickr, 37signals, Google, and Amazon (from both its Amazon Web Services and e-commerce perspectives) have raised the bar enormously when it comes to user expectations around ease of use, speed of improvement, and richness of interface. Enterprise IT departments rightly retort that they operate under a lot of constraints--whether data security, line-of-business requirements, or uptime--that a free social-media site does not. Nonetheless, the consumer Web sets the standard, and IT departments increasingly find users taking their IT into their own hands when the official solution isn't good enough. This forces IT to be faster and more flexible about deploying new services.

And none of these trends really had a single pivotal moment. Arguably, virtualization came closest with the advent of native hypervisors for x86 servers. But even there, the foundational pieces dated to IBM mainframes in the 1960s, and it took a good decade after x86 virtualization arrived on the scene for it to move beyond consolidation and lightweight applications and become widespread for heavyweight business production.

The richness of Web applications and the way they're accessed are even more clearly evolutionary trends that, even now, are still very much morphing down a variety of paths, some of which will end up being more viable than others. HTML5, Android, Chrome OS, smartphones, tablets, and 4G are just a few of the developments affecting how we access applications and what those applications look like.

Collectively, there's a big change afoot and cloud computing is as good a term for it as any. But we got here through largely evolutionary change that has come together into something more.

And that's a good thing. New computing ideas that require lots of ripping and replacing have a generally poor track record. So the fact that cloud computing is in many ways the result of evolution makes it more interesting, not less.