Five big business techs of the decade

While a number of these technologies have their roots in the prior decade, it was the noughts that saw the pieces fall into place and let them truly change the landscape.

I've been an IT industry analyst for almost 10 years. I've seen many technologies come, go, or fail to even arrive in the first place. However, during that time, a few techs have emerged that play a big part in fundamentally defining how businesses do computing. Most first emerged prior to 2000, but it has been during the past decade that they've truly changed things.

1. x86 processors were already well entrenched in corporate computing by the end of the 1990s, especially in their role as the Intel half of "Wintel" servers running Windows NT. However, their dominant designer and manufacturer, Intel, was heading in a different direction to handle the inevitable transition to the 64-bit processors and operating systems needed to keep pace with growing memory requirements.

That new direction was Itanium, a clean-sheet processor design from Intel and Hewlett-Packard intended to shed the legacy features of x86 and--not incidentally--cut the x86-compatible processor makers out of the picture. The Itanium family remains with us, but primarily as a processor for high-end HP servers. It was AMD that first added 64-bit extensions to x86, and Intel felt compelled to follow. It is this backward-compatible version of x86, not Itanium, that became the mainstream 64-bit server processor.
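That backward compatibility is still visible today: 64-bit x86 systems report themselves under AMD's original naming, "x86_64" on Linux and "AMD64" on Windows. As a minimal illustration (the helper name is my own), Python's standard `platform` module exposes this:

```python
import platform

def is_x86_64():
    """Report whether this machine is a 64-bit x86 (AMD64) system.

    AMD's 64-bit extensions, later adopted by Intel, still carry AMD's
    naming: Linux reports "x86_64", Windows reports "AMD64".
    """
    return platform.machine().lower() in ("x86_64", "amd64")

print(is_x86_64())
```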

2. The other big processor story of the decade is multicore. Near the end of 2000, Intel introduced the Pentium 4 processor based on the NetBurst microarchitecture. It was intended to eventually hit about 10GHz. In fact, it never got beyond 4GHz and came to be viewed as the last gasp of performance scaling through frequency.

AMD introduced its first multicore x86 Opteron processors for servers in 2005, which helped it gain market share for a time while Intel made major changes to its development plans and processes. IBM and Sun also aggressively pursued multicore designs in their RISC lines. Specialty processors such as Azul's Vega and Tilera's TILE lines went even more radically multicore. In short, frequency scaling is largely dead as a path to higher system performance; further gains will require a combination of more cores and specialty accelerators working in parallel.
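The shift described above--scaling out across cores rather than up in frequency--is what made parallel programming a mainstream concern. A minimal sketch in Python (function names are my own) that spreads a CPU-bound summation across however many cores the machine has:

```python
import os
from concurrent.futures import ProcessPoolExecutor

def partial_sum(bounds):
    """Sum one contiguous chunk of the range [lo, hi)."""
    lo, hi = bounds
    return sum(range(lo, hi))

def parallel_sum(n, workers=None):
    """Sum 0..n-1 by splitting the range across worker processes.

    Processes (not threads) are used so the chunks can actually run on
    separate cores despite CPython's global interpreter lock.
    """
    workers = workers or os.cpu_count() or 1
    step = -(-n // workers)  # ceiling division
    chunks = [(i, min(i + step, n)) for i in range(0, n, step)]
    with ProcessPoolExecutor(max_workers=workers) as ex:
        return sum(ex.map(partial_sum, chunks))
```

For a trivial sum the process start-up cost outweighs any gain; the point is the structure--independent chunks, no shared state, results combined at the end--which is the same shape whether the parallelism comes from extra cores or from specialty accelerators.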

3. When I first met Diane Greene, co-founder and then-CEO of VMware, in the fall of 2000, VMware was already selling a product that let developers run multiple operating systems on a single workstation. But she was in town to pitch me on something new: a pair of server virtualization products--GSX and ESX Server--that made it possible to consolidate multiple workloads on a single physical server and to provision them more quickly.

The basic concept goes all the way back to IBM's involvement with early developments in time-shared computing in Cambridge, Mass., during the early '60s. And all the RISC/Unix vendors of the time had their own approaches to slicing and dicing servers. However, it was VMware that brought server virtualization to the masses. Its product ran on standard x86 servers and it provided a way to consolidate workloads right at a time when IT purchases were dramatically slowing and anything that could save money was in vogue.

EMC bought VMware in 2003 for $635 million, a figure that--hard as it is to believe today--was widely viewed at the time as an overpayment. Today, server virtualization--an area where VMware remains the 800-lb. gorilla despite Microsoft's entry--continues to fundamentally change the way IT departments think about operating their data centers. Virtualization also underpins much of cloud computing, another major developing trend.

4. Linux and other open-source software were a big part of the dot-com and service-provider build-out of the late 1990s.

But enterprises? Not so much. This 2001 research note had to argue that Linux was, in fact, ready for serious production use. And whatever "ready for the enterprise" means in the abstract, the fact remains that the Linux 2.4 kernel was widely regarded as the first version deserving of that description, and it wasn't released until early 2001. IBM began its big Linux push at about the same time.

Thus, I'd argue that it's been this past decade, not the prior one, that has seen Linux and open source truly become a pervasive part of computing. That's not to say that open source has replaced all other software. But it has heavily influenced how companies do development, engage with user and developer communities, and provide access to their products--even when the software in question is proprietary.

5. My last entry has the greatest overlap with the consumer space. That's no coincidence, given that mobile devices are a highly visible example of what Citrix CEO Mark Templeton calls the "consumerization of IT."

Mobile devices encompass at least a couple of different things. The most obvious entrant is probably the smartphone--first in the guise of the BlackBerry and more recently the iPhone. We are now at the point where you can carry a bona fide computer in your pocket, complete with GPS and other sensors, and run applications that you install yourself. As my colleague Jonathan Eunice has noted, it really is a transformational experience relative to, say, my older Treo. The modern smartphone also reflects the reality that, for many, it's increasingly about mail, texting, and social media and not, you know, phoning.

However, the smartphone doesn't deserve all the limelight. The noughts have also seen the laptop computer transform. I'm not talking about the form factor so much--although netbooks have gotten their share of attention. Rather, I'm talking about the way that we can use them.

I've had laptops since the 1990s but it wasn't until about 2001 that conferences and other venues started to put up Wi-Fi networks. They worked fitfully (some things haven't changed as much as we might like), but this was the beginning of the connected laptop rather than the merely mobile laptop.

And that's why I see the smartphone and the laptop as part of the same mega-trend. It's not about a particular form factor or usage model. It's about (almost) always being connected to applications that increasingly live largely in the network.
