Virtualization is a major component of cloud computing, but the primary focus has been on virtualized server instances running on cloud providers such as Amazon EC2.
There is little argument that applications running in the cloud offer many attractive advantages, but ultimately users need to be able to access their data from any device and the data itself must maintain the highest levels of synchronization and integrity.
One of the big challenges is the fact that users are comfortable with fat applications (generally meaning not browser-based) for a large number of tasks. And while Google Docs and the like are great supplements for large apps like Microsoft Office, they have yet to supplant them completely.
To avoid disrupting users too much, and to avoid rewriting applications to be browser-based, many IT organizations have turned to desktop virtualization tools to solve the problem.
Desktop virtualization was supposed to solve one of the biggest headaches in IT: managing and securing corporate desktops more effectively. From IT's perspective, it's much quicker and easier to manage all the company's virtual desktops than its distributed physical desktops. When it comes time to roll out a software update, IT can deploy it to all virtual desktops at once, rather than going machine to machine and installing the software manually.
There are many options when it comes to desktop virtualization, but generally the "solution" most people think of is virtual desktop infrastructure (VDI).
VDI has real strengths, such as cost reduction and management efficiency, as well as real drawbacks, such as a lack of offline capabilities. But the core problem is that VDI takes an entire desktop that was never built for the cloud and puts it in the cloud. It just doesn't make sense to take your fat desktop and OS and stick them right onto the cloud.
Nor is this only a public-cloud problem. If you are a customer at a bank that's merged or been acquired in the last few years (and who isn't?), you can see the challenge in real life as your banker is forced to fire up VMware installations to access different systems.
Monday I watched a banker friend bring his laptop to a crawl with his third VMware Windows launch--all necessary to access the different applications from the various banks that were consolidated. Browser-based applications may not have solved the problem entirely, but the time and resources wasted keeping multiple versions of what's supposed to be the same OS seem completely ridiculous.
I started to wonder what the virtual desktop will look like in five years, when the cloud is more mature and widely adopted. Some are arguing that the cloud will kill VDI, and I tend to agree, but what will take its place? Or is VDI a stop-gap solution on the desktop, while virtualized servers become even more robust?
I chatted with Purnima Padmanabhan, vice president of products at MokaFive, a desktop management company. She told me the main driver for VDI deployments today is that companies want to centralize management for applications and data that live on the desktop.
According to Padmanabhan, there are two types of apps: those that should be run in the cloud, and those that should not. There are many apps that need to be run centrally and can be done so using terminal services. More and more of these centrally run apps are also being rewritten to be native cloud-based apps, which will eliminate the need to put the full desktop on the cloud.
The problem with VDI is that it doesn't fit either model--that is, native cloud apps or locally executed apps. You get neither the performance of a local app, nor the collaboration and simplicity of a cloud-based app. It's the worst of both worlds.
Even with the continued growth of the cloud and of alternative desktop and application delivery mechanisms, it seems pretty clear that VDI will remain entrenched in big corporations. And maybe that's its best use at the moment. But it's not the way the future appears to be playing out.