
Why 'drop in' doesn't always fit

Appliances and integrated stacks have long promised to simplify life for IT operations. But it often hasn't worked out that way.

Gordon Haff
Gordon Haff is Red Hat's cloud evangelist, although the opinions expressed here are strictly his own. He's focused on enterprise IT, especially cloud computing. However, Gordon writes about a wide range of topics, whether they relate to the way too many hours he spends traveling or his longtime interest in photography.

Certain ideas lurk largely at the boundaries of the IT industry, periodically making a push for a more central role. One such idea is the appliance, or integrated stack: an assembly of hardware and multiple layers of software from a single vendor.

The argument for this concept revolves around simplifying the acquisition of technology and optimizing its operation.

Of course, vertical stacks were once simply the-way-systems-were-built. This model largely gave way to horizontal layers such as microprocessors, operating systems, and databases, developed by different specialist vendors and brought together by the end user. (Former Intel CEO Andy Grove describes this shift in his book "Only the Paranoid Survive.")

Misplaced snowman (photo: Richie Diesterheft/CC)

However, the "Web 1.0" era, circa 2000, brought vertical integration to the distributed systems world in the guise of so-called appliances, many intended to plug into the network and perform some newfangled Webby function such as Web serving or video streaming. Cobalt Networks was perhaps the best known and most sophisticated, but there were many others, most of which were gone within a few years. For their part, many of the large system vendors also established appliance divisions. Those would soon be shuttered as well.

Appliances promised simplification and optimization but, in practice, they were widely viewed as too narrow and inflexible. Even software-only versions leveraging virtual machine technology have seen far more uptake as a way to distribute demos than as a way to deploy production applications. The fundamental issue is that, even though users ultimately interact with the application, it isn't really possible to fully abstract away the underlying pieces. The specifics of components like operating systems and servers have important implications for IT operations, however they're packaged.

As James Urquhart notes on his Wisdom of the Clouds blog: "Even if, say, a vendor solution is a 'drop in' technology initially, the complexity and tradeoffs of a long-term dependency on the vendor adds greatly to the cost and complexity."

This highlights something that's been a major stumbling block for a lot of integration plays in a distributed systems world. Many technologies and products that may make sense in the context of a "green field" deployment make a whole lot less sense when they have to work alongside existing networking, storage, servers, operating systems, and so forth. Furthermore, even if integration makes something easier to initially install, that doesn't necessarily make maintaining it any easier. In fact, it can make updates and upgrades harder by introducing dependencies and requirements that are specific to a single platform.

As an IT industry analyst during the first server appliance boom, I asked one question over and over: "How is hooking together a bunch of boxes from a bunch of different companies to perform a bunch of discrete functions going to simplify things?" I never got a good answer then and, even if the two situations aren't completely comparable, I'm not sure how the current talk of integrated stacks resolves this fundamental question either.