Breaking the expensive computer mindset

The way we think about new types of devices still often has a foot in a past where computers were expensive and scarce.

Computing is cheap. Both by historical standards and compared to many other machines and services that we purchase. All of us appreciate that intellectually at some level. But, when it comes to thinking about which devices make sense and which don't, it often seems as if we're treating computing like it's a scarce and expensive resource.

I see this tendency again and again when discussions turn to new types of devices or software, such as Google's Chrome OS. I often get asked when a certain shiny new thing will replace desktops running Windows or some other existing gadget.

Far more often, the better lens is whether the new thing fills a legitimate need for some group of potential users, whether or not it takes the place of something that already exists. That's because adding an incremental device to our entourage just isn't a big deal.

This is not to say that we want to mindlessly proliferate stuff. There's a "care and feeding" aspect to electronics, especially as we move toward general-purpose computers with their incessant appetite for updates and upgrades. Mobile gizmos of all sorts also need their chargers and cables, and their data needs to sync with other devices in individualistic ways. Especially in mobile, we're willing to tolerate sub-optimizations to reduce personal clutter. For a lot of people, the current generation of smartphones can replace a dedicated cell phone, a BlackBerry, an MP3 player, a camera, an e-book reader, and even a GPS.

But I think we collectively expect more convergence to happen than does, in fact, occur. There are just so many design compromises and trade-offs associated with using one device for multiple tasks.

Even in the mobile arena, any halfway serious photographer will want a separate camera. Someone who wants to do a lot of reading will probably prefer something with a larger screen than a pocketable smartphone. And while Google's GPS application for Android sounds really interesting for occasional use while traveling, with dedicated GPS units starting under $100, I'd probably go that route if this were a device I wanted to use all the time.

In the home, the so-called "3-foot" versus "10-foot" experience is one thing that keeps devices separate. Standard keyboards and mice don't fit well with the 10-foot living room experience, yet entering all but the most limited amount of text is essentially impossible without them. The user interfaces and applications for this setting have correspondingly evolved to involve simple pointing and clicking with a minimum of typing.

But it's about more than having different types of devices for different purposes. That framing assumes each computer has to serve a unique purpose.

In fact, there's no more reason to limit the number of computers around a house than there is to limit the number of clocks. This will be ever more the case as prices come down further, as our applications and data increasingly live in the network, and as we start to see devices that are optimized to complement a main computer or computers.

About the author

Gordon Haff is Red Hat's cloud evangelist, although the opinions expressed here are strictly his own. He's focused on enterprise IT, especially cloud computing. However, Gordon writes about a wide range of topics, whether they relate to the way too many hours he spends traveling or his longtime interest in photography.
