It got to be a fairly long list. Unfortunately, I found myself jotting down the equivalent of a plate of tapas when what I really craved was the Tech Enchilada: something that truly redefined personal computing.
We've grown so accustomed to inch-by-inch advancement that the idea may sound like asking for the impossible. But I can offer a couple of examples, each admittedly getting long in the tooth, where big ideas led to big breakthroughs.
On the computer hardware side, the Macintosh was the most brilliant development of the last couple of decades. PC bigots will give me an argument, but I believe the Mac set a qualitative bar that other computer makers (and Microsoft) have struggled to meet, let alone surpass.
You didn't need to be a geek to make the damn thing do your bidding. The machine didn't make you feel like a dodo. And it was actually fun to use. Unfortunately, that signal product debut took place in 1984, back when Ronald Reagan was President and Saddam Hussein was considered our buddy. Apple Computer subsequently introduced modular improvements to the design and the operating system but never came close to delivering a second "wow" product.
On the software side, the last big tech event was the Internet. Kudos to Tim Berners-Lee for the World Wide Web and to Marc Andreessen's team at the University of Illinois for the Mosaic browser, but give credit where real credit is due: The Internet was a belated throwaway gift, courtesy of the U.S. Defense Department. And the original DARPA project dates back more than 30 years!
There's since been nothing to rival those two separate developments. I suppose that's the fun of being surprised, and I'll be more than delighted if some bright bulb next year invents a radical departure from the creeping advancement that's marked most of the last couple of decades of personal computing.
In the meantime, here is my revised shortlist of what I'd like to see in the PC goodie bag in 2003. For your consideration:
Reduce the number of operating systems
Why do we need so many? Please. I personally couldn't care less which one(s) survive. But it's time to do some early spring-cleaning. For starters, I came up with VxWorks, QNX, pSOS, eCos, BeOS, FreeBSD, NetBSD, OpenBSD, BSD/OS, SkyOS, z/OS, OS/400, MPE, PalmOS, Windows CE, NitroOS-9, OpenVMS, NonStop Kernel (NSK), TinyOS, GNU/Linux, Minix, Multics, NetWare--and that's on top of Windows, Linux, Mac OS and Unix. There are probably others. What is clear is that with so many companies pouring so much money into promoting their pet projects, great strides in innovation will remain few and far between.
Continuous connections among devices
What with a PDA, a pager, a cell phone and a notebook, this has become a personal obsession. Can't they find a way to put all these devices--and other digital gadgets--in continuous connection with each other, as well as with the Internet? Somehow, we need to take smart connections a big step up the ladder.
The attentional user interface
Microsoft's lab folks have been playing around with this concept for a while. The idea is to make sure that updated information tracks you down wherever you may be, in a form that makes sense under the circumstances. This is more than a Yahoo alert. The underlying system is smart and can track your movements throughout the day. The computer will be able to tell if you are in front of the machine or hiking in the Andes.
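To make the idea concrete, here is a minimal sketch of context-aware routing: an alert is delivered over whatever channel fits the user's sensed situation. The contexts, channels and function name are all invented for illustration; Microsoft's actual research design is not public in this column.

```python
# Hypothetical sketch of an "attentional" notifier. Contexts and channels
# are assumptions, not any real product's API.

def route_alert(context: str, urgency: str) -> str:
    """Pick a delivery channel based on where the system thinks you are."""
    if context == "at_desk":
        return "desktop popup"
    if context == "in_meeting":
        # Low-priority items wait quietly; urgent ones interrupt.
        return "silent inbox digest" if urgency == "low" else "vibrating pager"
    if context == "offline":  # e.g., hiking in the Andes
        return "queue until reconnect"
    return "sms"

print(route_alert("in_meeting", "high"))  # → vibrating pager
```

The point is simply that delivery becomes a function of sensed context rather than a fixed preference setting.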
Bottomless disk storage
Inventor Steve Perlman points to Morpheus and Napster as offering a glimpse of what the world will be like when you have this kind of disk storage at your disposal. Think of what you'll be able to do with a computer that can hold 1,000 hours of video content. And the cost need not be prohibitive. In fact, the price of a new hard drive stays roughly flat from one generation to the next, even as capacities keep climbing.
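As a back-of-envelope check of that 1,000-hour figure, the arithmetic is simple. The 2 GB-per-hour rate below is my own assumption (roughly DVD-quality MPEG-2); the column doesn't specify an encoding, so adjust the constant and the result follows.

```python
# Rough capacity estimate for 1,000 hours of video.
# GB_PER_HOUR is an assumed bitrate (~4.5 Mbps ≈ 2 GB/hour), not a figure
# from the column itself.

GB_PER_HOUR = 2.0
HOURS = 1000

needed_gb = GB_PER_HOUR * HOURS
print(f"{HOURS} hours of video needs about {needed_gb / 1000:.1f} TB")
```

At those assumptions, 1,000 hours lands around 2 TB, which is why flat drive prices and climbing capacities make the scenario plausible.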
The end of the master-slave relationship
Devices should be made subservient to their owners, not the other way around. The late (great) Michael Dertouzos, formerly director of the MIT Laboratory for Computer Science, told Scientific American that "we made a big mistake 300 years ago when we separated technology and humanism." He was spot on. It's about time to put the technology and humanism components back together. Why are we still stuck sitting in front of these dumb machines, staring at screens all day? Beats me, but it's clear who's in charge--and it ain't us.