
Computers for the people

Budding mobile computing designers will win the race if they remember their customers: they're just human.

Tom Krazit Former Staff writer, CNET News

SANTA CLARA, Calif.--Designing a user interface for a mobile computer isn't hard; all you have to do is think like a person.

Sounds simple, but it's taken a long time for that realization to set in, said Stu Card, manager of the user interface group at the famed Palo Alto Research Center. Card joined fellow researcher Ted Selker of MIT's Media Lab at Sofcon 2008 to discuss human interfaces for mobile computers, and just how differently engineers have to treat these devices compared with their older PC brethren.

MIT's Ted Selker interacts with a toy dog equipped with a sensor that can recognize his mood based on his eye movement, and react accordingly. Tom Krazit/CNET News.com

PCs weren't necessarily designed for end users in the early days. They were designed for developers to create applications, or for corporations to make their workers more productive. But mobile computers, whether they are smartphones, mobile Internet devices, or whatever, are fundamentally different; they're with us at all times and are used on the go, not as stationary, sedentary terminals. And they are used as social devices, whether that's planning a get-together with friends, taking pictures at the party, or settling extremely important barroom arguments, such as who hit the most home runs for the 1993 New York Mets (Bobby Bonilla).

Card focused on the look and feel of the software that accompanies smartphones. He used Apple's iPhone as his example, and examined how the iPhone was designed according to four different human factors: social, rational, cognitive, and biological. The different factors represent the amount of time one spends on a task or problem; you might take a second to page through a library of pictures, but spend months or years developing a network of friends.

"Mobile computing is much more intimately tied to a user's life. You need to design simultaneously on at least four levels, and functional design is not the only requirement," Card said.

Apple made the breakthrough it did with the iPhone because it came up with ways of interacting with the device that make sense on biological and cognitive levels, Card said. Translated, that means the iPhone plays well to natural perceptual and motor skills, as well as our desire for immediacy.

For example, the notion of finger gestures as the primary control is much more intuitive than navigating through a series of menus, and makes the device more intimate. And Apple's groundbreaking decision to put the browser first and the keypad second makes browsing much easier and more compelling than on other mobile devices.

As you move to the higher levels of mobile computing--the rational (problem-solving) and social (in short, event planning)--the computer itself takes on the role of a sensor, Selker said. "(It's about) using sensors and virtual sensors to understand and respect human intention."

Selker created the ThinkPad's TrackPoint for IBM, and has been working on human-facing design for years. Right now, he's working on adding sensors and computing capabilities to all kinds of commonly used devices, from bike helmets to toy pets.

The idea is that anything can be a sensor, and anything can take input from the world and provide feedback to the user. This sounds like a key part of the future development of mobile phones, where phones evolve from two-way voice and data communications devices into devices that capture and analyze all kinds of data, such as location, weather, and even mood.

Selker demonstrated his smart bike helmet, which turns off his MP3 player if a loud noise (such as a horn) is detected around his bike. At some point, he thinks it might be able to prevent the urban cyclist's nightmare: getting doored. If a sensor on the bike could detect some basic information, such as whether someone is sitting in the driver's seat of a car that has just been turned off (the most likely prelude to a door swinging open), it could send some sort of "SOS" alert to the rider.
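The helmet's noise-mute behavior reduces to a simple feedback rule: pause playback when ambient noise crosses a threshold, resume when it subsides. A minimal sketch in Python, where the threshold value, the `MP3Player` class, and the `on_microphone_sample` callback are all illustrative assumptions, not Selker's actual implementation:

```python
# Hypothetical sketch of the helmet's mute-on-loud-noise logic.
# AMBIENT_THRESHOLD_DB and all names below are assumptions for illustration.

AMBIENT_THRESHOLD_DB = 85  # assumed rough level of a nearby car horn


class MP3Player:
    """Stand-in for the rider's music player."""

    def __init__(self):
        self.playing = True

    def pause(self):
        self.playing = False

    def resume(self):
        self.playing = True


def on_microphone_sample(player, level_db):
    """Called for each microphone reading; mutes or resumes playback."""
    if level_db >= AMBIENT_THRESHOLD_DB and player.playing:
        player.pause()       # loud noise detected: let the rider hear it
    elif level_db < AMBIENT_THRESHOLD_DB and not player.playing:
        player.resume()      # street is quiet again: restore the music


player = MP3Player()
on_microphone_sample(player, 95)  # horn blast
print(player.playing)             # False: music paused
on_microphone_sample(player, 40)  # ordinary street noise
print(player.playing)             # True: music resumed
```

A real device would debounce readings over a short window rather than react to a single sample, but the sensor-in, feedback-out shape is the same one Selker describes.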

The point is that these computers need to be uniquely human; they need to act as extensions of our five senses, present us with information, and be usable in a way that naturally makes sense. The point-and-click menu-driven environment of desktop or notebook applications is not going to fly in the mobile world.

Human behavior has already evolved as we've grown more mobile. Think of college students who wait until the last minute to make definitive plans but maintain a vague understanding with their friends that they'll all get together at some point during the week, a practice Card called "microcoordinating."

This custom arose from nothing more than simple cell phones and text messaging, and those folks will want their computers designed around them. Some combination of intuitive interfaces and sensory perception will carry the day.