When multitasking trumps immersion

Immersion implies single-tasking. And people increasingly multitask much of the time.

Gordon Haff
Gordon Haff is Red Hat's cloud evangelist although the opinions expressed here are strictly his own. He's focused on enterprise IT, especially cloud computing. However, Gordon writes about a wide range of topics whether they relate to the way too many hours he spends traveling or his longtime interest in photography.

I had business on the East Coast this week that kept me from heading out to California for Cisco's C-Scape Global Forum 2008. However, Twitter offered me the opportunity to "eavesdrop" on some of the back-channel chatter at the event. Unsurprisingly, there was apparently lots of talk about video and collaboration enabled by video.

Unsurprising because this has been a drum that Cisco's been beating for a few years now. And for good reason. Cisco makes networking infrastructure. Video consumes networking infrastructure. So, if you're Cisco CEO John Chambers, that makes video a very good thing.

More video and higher-fidelity video would also seem to be part of the general theme of computing getting more immersive, more realistic, and more multimedia. After all, can anyone seriously doubt that we've all inexorably moved from a black-and-white, rabbit-ears world to one where we can stream high-definition movies to our flat-panel televisions with a few clicks of a remote? Or that most gaming, even if not "realistic" in the literal sense, is clearly about immersing the player in a universe using as many sensory hooks as possible? Text adventure games may well have drawn their "graphics from the limitless imagery of your imagination," as the old Infocom ad said, but most modern gamers prefer Halo 3 on an Xbox 360.

However... there's a problem with this storyline. The supporting evidence is all well and good, but there are also some compelling counterexamples. Take communications technologies, for example.

A prototype of the first Bell System PicturePhone was developed in 1956. And the years since have seen all manner of technologies and products that aimed to make stepping up to a camera as commonplace as picking up a telephone. But whether it was Intel's ProShare, so beloved of then-CEO Andy Grove, webcams, or videoconferencing systems from the likes of PictureTel, video has never developed a mainstream following as a way to augment audio (outside, of course, of professionally produced broadcasts).

You could argue--and you would be correct--that a lot of the products and technologies I described were pretty low fidelity. However, today we have the technology and network infrastructure to deliver much higher levels of video quality. And, where there is a genuine need for next-generation, dedicated-room videoconferencing, there does seem to be a nascent market developing, with impressive products from companies such as HP and, yes, Cisco.

However, when it comes to communicating one-on-one, or even a small team collaborating relatively casually, video still often seems more hindrance than help.

I think a big part of the issue here is that immersion implies--indeed almost requires--single tasking. Yet, for better or for worse, many see the latest generation entering school and the workplace as highly multitasking. Helen Laggatt's description is typical of the group:

Millennials multitask, and they do it well. This is the generation most likely to be sat in front of the television while listening to their iPod, texting their friends and surfing the Internet. They like to personalize their online and mobile environments, able to source and install tools to make life easier.

More broadly, it's hard to dispute that just about all of us are far more inclined to split our attention among multiple activities. And this is reflected in our tools. IMs and emails often trump voice, not because they're "better" in some absolute sense (though they can be for various reasons), but because they can be sent unobtrusively from a cell phone or a laptop in the middle of a busy meeting.

Immersive computing is a real trend. We're not going to return to CGA graphics and tinny PC speakers. But it doesn't happen everywhere, and it probably never will.