Thin client computing grows up

In the past 20 years, IT has faced many hurdles when trying to move toward a thinner, more manageable form of client computing. Those hurdles have since fallen.

Jonathan Eunice, Co-founder, Illuminata
Jonathan Eunice, co-founder and principal IT adviser at Illuminata, focuses on system architectures, operating environments, infrastructure software, development tools, and management strategies in networked IT. He has written hundreds of research publications and several books.

I've been following the evolution of client-side computing off and on for over 20 years. Remember ASCII terminals? Green screens? Beehives? X terminals? If you do, they're most likely dimming memories.

The history of client-side computing is filled with efforts to shift the balance of power between the server (né host) and the client device. Which side is responsible for what, and how the sides communicate with each other, determine the cost, control, security, flexibility, and richness of the result. Some years it's "do everything meaningful on the server." Others, "do most of the work on the client." Over the years, pretty much every possible option has been tried.

The "fat" personal computer (PC) is one of the most enduring modes of client computing. It achieves the highest level of flexibility and richness for individual users. But that high flexibility and functionality has come with a considerable cost in our ability to manage, backup, secure, and update those devices. The "networked personal computer" has been a growing fact of corporate life for about 25 years, but it's been a significant IT hassle for 24.962 of those years. Users get the flexibility; IT gets called in to make them work.

It's no wonder that IT has been trying to establish a measure of control, protection, coordination, standardization, and efficiency around end-user PCs for almost their entire existence. But that's proved difficult and expensive. Give users "their own system," and they treat it as such. How dare IT fiddle with it! But if the hardware fails, or it gets a virus, why didn't IT do something about that?!

One of the longest-standing hopes of IT has been moving toward "thin clients"--client devices and a related style of computing that provides a degree of user interactivity rivaling that of a fat PC, but that relegates the lion's share of application processing and data storage to back-end servers, where IT can do its provisioning, updates, backups, security, auditing, and other assorted management tasks efficiently. Thin client computing has been pitched, in various forms and incarnations, since the early-to-mid 1990s, but a variety of limitations restricted the number and kinds of users for whom it was practical. Thin clients and presentation virtualization were deemed fine for clerical and call center workers, for example, who needed only modest functionality and whose employers needed very low operating costs. The approach was also considered suitable for highly regulated environments, in which control and data security were paramount. But general users? Not so much. And power users? Almost never.
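
To make that division of labor concrete, here is a deliberately toy sketch in Python--emphatically not any vendor's protocol, and every name and number in it is hypothetical. The client forwards input and displays whatever comes back; all application logic and data live on the server.

    import socket
    import threading

    HOST, PORT = "127.0.0.1", 9000            # hypothetical endpoint
    srv = socket.create_server((HOST, PORT))  # bind before the client connects

    def serve():
        # Server side: owns all application logic and data (a trivial
        # word count stands in for "the application").
        conn, _ = srv.accept()
        with conn:
            data = conn.recv(4096).decode()        # input forwarded by the client
            result = f"{len(data.split())} words"  # all processing happens here
            conn.sendall(result.encode())          # ship the rendered result back

    threading.Thread(target=serve, daemon=True).start()

    # Client side: no application logic at all; forward input, display output.
    with socket.create_connection((HOST, PORT)) as client:
        client.sendall(b"the quick brown fox")
        print("server says:", client.recv(4096).decode())  # -> server says: 4 words

Real remote-display protocols do far more (compression, encryption, session brokering), but the asymmetry is the same: the client is a view, while the state and the work stay where IT can manage them.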

Times are changing, however--to the point that thin client computing, in one shape or another, now suits a large swath of the market. What changed? Almost everything.

The Network

Networks used to be adjuncts to the work done on computers. They were low-bandwidth, and not always available where and when you needed to work. Today the network is the center of IT. Individual connections in homes, hotels, airports, coffee shops, and all manner of other places run at multi-megabit speeds, wired or wireless. Servers and data centers routinely have multi-gigabit connections. When you want to drive high-interactivity, data-rich sessions to many end users, great networks are key. We now have them.

The Server

Most servers suitable for driving modern client sessions--that is, Windows, Unix, and Linux servers--simply weren't up to handling dozens, hundreds, or thousands of concurrent user sessions. But over time, Moore's Law provided enough CPU horsepower, as well as multicore and multithreaded designs that are custom-fit for handling many independent workloads. Operating systems have gotten much better at handling concurrent activities. Virtualization is the real kicker. Even low-cost servers can today effectively run dozens or hundreds of concurrent activities.
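
A back-of-envelope illustration of that multiplication, in Python; every figure below is an invented assumption for the sake of the arithmetic, not a benchmark or vendor number:

    # Rough session-count sizing for a hypothetical commodity server.
    # All numbers are illustrative assumptions, not measured figures.
    sockets = 2
    cores_per_socket = 8
    threads_per_core = 2        # SMT / hyper-threading
    sessions_per_thread = 4     # assumes light, bursty desktop workloads

    hardware_threads = sockets * cores_per_socket * threads_per_core
    print(hardware_threads * sessions_per_thread)  # -> 128 concurrent sessions

Whether the real ceiling is CPU, memory, or I/O depends entirely on the workload, but the multiplication shows why "hundreds of sessions per box" stopped being exotic.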

The Consumer

The ubiquity of laptops, WiFi, tablets, and 3G smartphones has changed user expectations of "anywhere computing" from a "that might be nice to have, someday, if it didn't cost too much" thought to a "want it, need it, now!" requirement. There is a growing expectation that you'll have quality access to applications from pretty much wherever you are, on pretty much whatever device you like.

The Technology

Many of the hurdles encountered in previous runs at thin client computing made the approach only partially successful. But insufficient networks, limited server consolidation, low-performance client devices, and poor multimedia quality have been largely addressed by 20 years of technology evolution. The latest wave of innovations includes multimedia accelerators (e.g., TCX and PCoIP), application virtualization (e.g., App-V), and user virtualization (e.g., AppSense)--things that dramatically improve thin client computing for both end users and administrators.

Industry Approach

In the old days, if you asked about thinner client computing or server-centric desktops, everyone had a specific, quite rigid answer. Microsoft: Terminal Services. Citrix: presentation virtualization. Wyse: thin client devices. Over the years, however, almost all the major players have broadened their product lines, with different options aimed at different use cases and different requirements. There is much more of a "continuum" approach, from both product and service providers. This has led to a much more mature discussion, with less religious bias and much better coverage of actual user and deployment needs.

The Competition

Desktops and laptops used to be the ne plus ultra of client computing. Now they're just one option competing with phones, Web apps, iPads, and other devices. Web 2.0 apps and Software as a Service (SaaS) have given traditional desktop apps an especially strong run for their money, becoming more functional and interactive every year while keeping a very low on-premises footprint and management burden. Stronger competitors mean that Microsoft, Citrix, and everyone else involved in PCs, thin clients, and desktop virtualization are stepping up their game, too.

Make no mistake, client computing options are getting thinner. Even when executed on traditional "fat" PCs, more and more applications, user sessions, and interactions are being provided via "thin" or "thinnish" techniques: remote desktops, application virtualization, and rich Web apps. And completely thin client devices are far more capable than they once were. Because any change to desktop computing happens in full view of end users, change is more gradual than in the server room. But given the enormous cost, flexibility, manageability, and service-level wins that server virtualization has wrought in recent years, IT is eager to extend those same advantages into the client domain. User requirements, available technology, competitive pressures, and industry maturation all point to a thinner client computing experience in coming years.