That's the gist of what academics and engineers told IT workers gathered here this week for the three-day Association for Computing Machinery conference. The event is typically a sort of group hug between computer programmers and scientists, but the mood turned a tad nasty Tuesday as researchers lightheartedly ripped on computer scientists, who made up the bulk of the 200-member audience.
Industrial designers poked fun at virtually all facets of computers and other electronic gadgets, and the Apple iMac--displayed in PowerPoint presentations in its groovy new shades--bore the brunt of scorn and jokes about how fashion has superseded functionality. One presenter went so far as to blame the nonintuitive, nonhuman-oriented design of desktop computers for the current economic slowdown that has ravaged the broader technology sector.
"The situation is really serious," said William Buxton, chief scientist for graphics software developer Alias Wavefront and associate professor in the Department of Computer Science at the University of Toronto. "Much of what is happening economically is because we are pursuing a foolish thing--growing PCs exactly as we have been in the past. I believe design is the key to get us out of this slump."
Certainly the speakers recognized other obvious causes of the downturn, including macroeconomic trends, cyclical patterns, overexuberance for tech stocks in the late 1990s and other issues. But the beige box sitting in millions of office cubicles and living rooms also deserves some blame, presenters said, because only a relatively small percentage of literate, technologically astute adopters can master it in its current configuration.
Targets of the critics' scorn included convoluted commands such as the common "Control-Alt-Delete" sequence used to close a program or perform an emergency shutdown. They also lambasted computer designers who refuse to distribute the machines' intelligence to smaller devices scattered throughout the home, instead insisting on packing a single box with maximum functionality.
Buxton and several other speakers said the fundamental design of the PC hasn't changed since the early 1980s, when the devices first became widely available to consumers. The devices have become smaller, faster and less expensive, but they imitate their ancestors in form and function.
"If Rip Van Wrinkle went to sleep in 1982 and woke up today, he'd be able to drive our modern computers with no problem because they're essentially unchanged," Buxton said, alternating between a slide of a 1982 computer and trio of iMacs in tangerine and other bright shades. "There'd just be more crap on it."
The essence of the speakers' complaints was that computer engineers have spent the last five decades designing computers around the newest technology--not for the people who use the machines. That has resulted in computers packed with technologically interesting but relatively useless features that have little to do with our daily lives. The vast majority of computers have few interactive features and are largely unable to forecast human behavior, Buxton said, rendering them less advanced than airport toilets that flush automatically when the user departs the stall.
"Shouldn't your computer be as smart as your toilet?" Buxton asked to a round of laughter.
Speakers compared modern PCs to Cuisinarts--highly functional, expandable machines that typically gather dust on kitchen shelves, largely unused by novice cooks because they're heavy, hard to move and too complex. They said the computing industry has largely lost touch with humanity and needs to reconnect by importing anthropologists, sociologists and regular users into the design and engineering process.
"We have a market of very confused customers and observers," said Martin Schuurmans, CEO of Dutch electronics giant Philips' Center for Industrial Technology. "We distinguish ourselves by the color and design, and...maybe a blinking antenna in Japan. I would call that a world fragmented with features, and...of course it cannot stay that way.
"Concentrate on the task, ladies and gentlemen!" Schuurmans begged the audience. "Don't concentrate on the technology or the tool. The PC in many respects has too many functions, and it does them too poorly. So there's all the reason in the world do look at different choices. Let the machine work for us humans--it's high time."
Michael Dertouzos, professor and director of the MIT Laboratory for Computer Science, said the IT industry has failed to create "human-centered computing" and instead requires people to have a relatively high degree of skill to perform the most simple digital tasks. For example, he said, users of Windows-based computers must know that to turn off the computer they have to click on "Start"--not an intuitive step to end a computing session.
Although the scientific community has learned many computing tricks, revolutions in fields ranging from genetics to astronomy will not occur until the computing industry makes fundamental changes to its machines, he said. Advances in speech recognition software, for example, will open up the Internet to the estimated 2 billion people worldwide who cannot read or write, vastly increasing the size of the Internet and the potential data collected on it.
"We have been building computers for 40 years, but they are not very different at the base level," Dertouzos said. "We're not exploiting this technology revolution. We're hardly scratching the surface."
The speakers were blunt in assessing blame for the technology industry's dysfunctional state. Most blamed technology-ignorant financiers who are only impressed by the most complex tools and gadgets--people who presume the most confounding technology must be the most advanced. Because of such faulty logic, the speakers said, Wall Street rewards needless complexity and shuns those who build the most simple, human-centric devices.
"Sand Hill Road is completely oblivious to this stuff," Rod Brooks, director of the Artificial Intelligence Laboratory and Fujitsu professor of computer science at MIT, said of the Silicon Valley's venture capital strip. "They're still scratching their heads, trying to figure out what happened to the dot-coms."
Brooks and others said the seeds of a human-centered computing revolution have already been planted. In the field of robotics, for example, the vaguely human look of the typical robot has created a generation of robot engineers who intuitively design and program the devices on a human scale.
He said the advent of simple consumer robots--even baby bots such as Furbies, sophisticated dolls that learn words that the child "teaches" to them by repetition--has vastly different ramifications than the advent of the first sterile computer boxes that started popping up in homes in the early 1980s.
"We're talking about the emotional coupling between the robot and the human," Brooks said. "We're pushing on having a human form just to...understand the relations between robot and machine."