
Computing from the bottom up

The IT industry is in a qualitatively different place than it once was. The enterprise architects still have a job to do.

Gordon Haff

Time was when most enterprise software came in the front door as part of a formal, signed-off-at-the-highest-levels procurement process. Or it got written in-house as part of an equally formal, multi-year development plan. Or some combination of the two. You didn't expect that expensive packaged software you bought to just work out of the box, did you?

Lots of software still gets purchased and developed that way, of course. However, the truly striking story of the past decade is how so many of the tools and other software that we take for granted today are essentially bottom-up phenomena. They largely came in the back door and made their way into what's often called the "Shadow IT" of organizations. Official IT didn't make this software ubiquitous and mainstream. For the most part, it was already ubiquitous and mainstream by the time IT departments got around to blessing it.

Linux (and, more broadly, open source in general) is perhaps the canonical example of this trend. In some respects, Linux adoption just mimicked past adoption patterns for distributed computing in general--from Windows NT servers to PCs and even Unix in the early days. However, open source licenses make backdoor sourcing a big step easier. Indeed, the basic idea that open source licensing helps to build a developer and user base that can then be monetized when the software goes into production underpins a lot of the thinking around business models associated with open source.

However, the trend goes well beyond open source. Consider the following two examples.

Especially as workforces get more distributed, tools such as Novell Teaming + Conferencing and Lotus Domino have moved beyond e-mail and calendaring to encompass a much broader set of formal and informal interactions within a company. Cisco CEO John Chambers has said that "collaboration" is the one word that describes where his company and the entire technology industry are headed.

However, beyond e-mail and calendaring, the tool with the biggest impact has probably been instant messaging, rather than something bigger and more architected. And IM came in from the consumer space, often informally. Indeed, many organizations still just use freebie IM from AOL or Yahoo or Google rather than some enterprise version.

Even the red-hot virtualization trend is an example of bottom-up adoption. One of the reasons that virtualization was able to really break out was that it lent itself to small, local IT optimizations with immediate payback. You could install VMware on just one or two servers and see an immediate benefit. You didn't need to rototill your data center management or change any number of business processes.

Today, the next phase of virtualization--which goes by terms like Dynamic IT--is indeed a broader concept requiring a more deliberate and phased approach. But it got its start at the small scale (which distinguishes it from many of the virtualization management solutions being touted today that are only truly useful at data center scale).

The downside of all this ad hoc-ism is that it can lead to tools that can't really grow or that lack other characteristics--such as reliability--that become more important as usage transitions from casual to business-critical. (See Twitter.) But that genuine caveat aside, the IT industry is in a qualitatively different place than it once was. The enterprise architects still have a job to do. But no small part of that job is now integrating with tools that users and departments have brought in on their own.