FAQ: Detangling virtualization

CNET News.com takes some of the mystery out of a complicated technology that has a steep learning curve.

Stephen Shankland, principal writer
For anyone buying servers or server software, and even many buying PCs, virtualization is getting hard to avoid.

The term typically refers to running multiple operating systems simultaneously on the same computer. It's long been around on high-end servers, but new software and hardware options mean mainstream users are starting to have to worry about virtualization. For example, both major commercial versions of Linux now have virtualization built in, and the next version of Windows for servers will, too.

Virtualization is complicated. But there are reasons you might want to take it seriously.

Mac users can run Windows to tap into the corporate e-mail system, or someone with a Windows Vista PC can run software that will only run on Windows XP. But in practice, the technology today is most likely to appeal to server customers, with advantages ranging from scrapping old hardware to cutting electricity bills.

Virtualization is a classic case of disruptive technology with a steep learning curve. For example, at the upcoming HP Technology Forum, there are 84 presentations to help Hewlett-Packard customers understand virtualization.

Here are some answers about what's happening today with virtualization.

What exactly does virtualization mean?
The term virtualization means that software is running on some sort of virtual foundation rather than the physical hardware it typically expects. Instead of a single operating system controlling a computer's hardware, the virtualization software controls it, providing multiple compartments called virtual machines for the operating systems to run in. Inserting a virtual layer can be liberating. For example, a running operating system can be moved to a fresh server if the one it's running on is suffering a failing memory bank or overtaxed processors.

Virtualization actually has been around the computer industry for decades, for example to run multiple jobs on mainframe computers or to hide the particulars of individual hard drives in a storage system. But now, it's no longer just a high-end technology.

Why is virtualization catching on now?
Because the technology is maturing and can help fix some common problems. Much of the credit for making virtualization a reality goes to an EMC subsidiary called VMware, which brought the technology to computers using mainstream x86 processors such as Intel's Pentium and Advanced Micro Devices' Opteron. In the first quarter of 2007, VMware's revenue grew 96 percent from the year-earlier period to $256 million, so there's no doubt the market is real and growing fast.

VMware built its business gradually. It began on desktop computers, where programmers could harmlessly test crash-prone new software in virtual machines or run Linux and Windows on the same computer, for example. In more recent years, the company's server software business became more lucrative as virtualization enabled customers to replace several inefficiently used servers with a single server running multiple virtual machines. Now the company is moving to a grander virtualization-based vision in which multiple tasks can run with shifting priorities on a pool of centrally managed machines.

Do I get a choice of suppliers here?
Plenty of competitors want a piece of VMware's action. First on the scene was Xen, an open-source project sponsored by Linux sellers, server makers and a start-up called XenSource. Virtual Iron is another start-up trying to build a business on Xen. On the proprietary side of the industry, Microsoft acquired a company called Connectix to counter VMware's products, but has had only modest success. The real fight will begin by June 2008, when the forthcoming "Longhorn Server" version of Windows gets updated with virtualization software code-named Viridian. Even though Xen is here now, VMware marketing director Bogomil Balkansky said Viridian is his top concern.

Although Xen got the jump, a newer open-source virtualization project called KVM has stolen some attention. Red Hat and rival Linux seller Canonical, with its Ubuntu distribution, have blessed KVM, and many Linux programming heavyweights like its approach.

Another flavor of virtualization lets a single operating system be carved up into several virtual compartments, a lighter-weight approach that's been popular for Web site hosting. SWsoft's Virtuozzo, based on the open-source OpenVZ project, employs this approach, while Sun Microsystems built the technology into its Solaris 10 operating system. Microsoft has said it's considering a similar move for Windows.

Useful technology, lots of buying options--sounds swell. Why doesn't everybody do this?
Mainly because it's new to most people. It also can hurt performance, since the virtualization software intercepts communications between hardware and software, and virtualized computers need more network capacity and more memory. Virtualization adds a new layer of complexity, too, and administrators must test it with their hardware and software.

It doesn't sound so complex to me. Software just runs in a different compartment, right?
Consider some of the repercussions of unshackling software from its hardware. Much server software is priced on the basis of how many processors a server has. What happens when, through virtualization, you're running a particular application on two of a computer's four processors? Then what happens when you boost the virtual machine size to three processors? And how about moving that virtual machine over to a different system altogether? The software industry has only begun adapting to the new reality.
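The licensing questions above come down to simple arithmetic. Here is a sketch using an invented $2,000-per-processor rate (an illustration, not any real vendor's price) to show how merely resizing a virtual machine changes the software bill:

```shell
# Hypothetical per-processor licensing: the rate below is made up
# for illustration. Resizing a virtual machine from two processors
# to three raises the bill even though no hardware changed hands.
price_per_cpu=2000
for vcpus in 2 3 4; do
  echo "VM sized at $vcpus CPUs: \$$((vcpus * price_per_cpu))"
done
# prints totals of $4000, $6000 and $8000
```

The open question for the industry is which number belongs in that multiplication: the physical processors in the server, the processors assigned to the virtual machine, or something else entirely.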

Here's another wrinkle: some software, during installation, records what amounts to the hardware fingerprints of the computer it's running on to counter piracy, while other packages require a hardware "dongle" to be attached. So there are serious constraints to shuttling virtual machines around with abandon.

OK, now I'm intimidated. Is this just a fad that I can wait out?
If you're a server administrator, you probably can't and shouldn't avoid virtualization forever. Xen now is built into both major commercial versions of Linux--Novell's Suse Linux Enterprise Server and Red Hat Enterprise Linux. And Intel and AMD are racing to build virtualization into their chips. Newer processors from both companies have hardware support for some virtualization tasks, making it possible to run Windows on Xen, for example. Future features will improve performance of memory access. Virtualization on the PC, though, isn't likely to catch on widely anytime soon.
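On a Linux system, you can see whether a processor advertises this hardware support by checking the CPU feature flags the kernel reports: "vmx" marks Intel's virtualization extensions and "svm" marks AMD's. A minimal sketch, assuming a Linux machine with a readable /proc/cpuinfo:

```shell
# Look for hardware virtualization flags in the kernel's CPU report.
# "vmx" = Intel VT; "svm" = AMD-V. The flags appear only when the
# feature exists and is enabled in the system's firmware.
if grep -q -E 'vmx|svm' /proc/cpuinfo; then
    echo "hardware virtualization supported"
else
    echo "no hardware virtualization flags found"
fi
```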

What can I do with virtualization on a PC?
Software from Parallels lets Mac users run Windows on the newer Intel-based machines. VMware is working on its own software, called Fusion, to accomplish the same end. That can be handy when Mac users need to fit in better with a Windows-dominated world. For Windows users, VMware's player software can be used to try out Linux, run older software on a newer system, and isolate personal and work tasks. Intel thinks administrators will like to run their own management software in a separate virtual machine, letting them fix worm-infested PCs remotely. Developers get the ability to debug programs in virtual machines that can simulate diverse combinations of software and that don't corrupt hard disk data if they crash. For administrators, another appealing PC virtualization approach replaces standalone systems with virtual PCs running on central servers, cutting energy and maintenance costs.

VMware offers free versions of its software, but there are other fees. To run Windows on a Mac, you need a full--not upgrade--version of the operating system. And with Vista, the restrictions get even tighter: only the pricier Ultimate and Business versions are permitted. Businesses with a volume license agreement with Microsoft may run up to four instances of Windows Vista Enterprise on a single PC, but others must pay for each copy.

What are the server costs?
Xen is built into Red Hat Enterprise Linux and Suse Linux Enterprise Server at no extra cost beyond the support subscriptions. Novell customers may run as many SLES virtual machines as they want on a single computer for one support subscription. Red Hat prices its RHEL Advanced Platform version similarly, but imposes a four-virtual-machine limit on its basic RHEL Server version.

VMware's prices have come down. For example, its former GSX Server product became the free VMware Server product. But there still are significant fees. ESX Server and the higher-level components that make up the company's Virtual Infrastructure 3 product cost a minimum of $1,675 for a dual-processor server, including support and subscription costs. The fuller-featured Enterprise version costs $6,957 for the same hardware. Doubling a server's processor count doubles the price. It sounds steep, but it's still likely to be less expensive than buying a new server or three.