
Scratching the 25-year PC itch

CNET News.com's Charles Cooper celebrates the birth of the PC and the long-awaited emergence of "human-centric" computing.

I was never particularly big on anniversaries. But in a couple of weeks, the computer industry will mark a milestone that deserves a moment of quiet celebration. On Aug. 12, 1981, the IBM Corporation debuted the PC.

It's hardly hyperbole to suggest that this single announcement did more to change the world of technology than anything that has come since--including the invention of the World Wide Web.

At the time, this was a gamble. Big Blue didn't make machines for the average Joe. Remember, this was a company that was all about Big Iron--and I mean really big iron in the form of mainframes, and then minicomputers. But the opportunity was too juicy to ignore. Up until then, the personal computer business was teeming with oddballs and small fry--along with the likes of Silicon Valley's soon-to-be poster child, Apple Computer. Interesting and colorful characters populated the map, to be sure, but you couldn't find a mogul among them whose products were coveted by the business world.

All it took was the hard work of a gutsy team of engineers working in seclusion in Boca Raton, Fla., and a hard-driving executive named Don Estridge to convince IBM's senior management that it had something special.

IBM became successful beyond its dreams. So successful, in fact, that it soon attracted clone makers like Compaq, AST Research, Gateway and Dell--as well as a host of others, most of which have either gone out of business or been acquired.

That first PC obviously was limited, but it triggered excitement about the future. IBM had proved that this was going to be a lot more than a hobbyist's affectation. This was serious stuff and the industry soon attracted many of the best and the brightest that the United States (and other nations) could graduate. As far as the future went, the sky was the limit.

Or so we thought before the disappointments set in. Some folks had believed it would be no time before we all were interacting with computers the same way the starship Enterprise crew did on "Star Trek." But that pipe dream is still relegated to the realm of science fiction novels.

Michael Dertouzos, a brilliant computer scientist who directed MIT's Laboratory for Computer Science for more than two decades, famously returned to this theme in his writings, posing a (seemingly) simple question: Why are we still serving the machines rather than the other way around? Despite the advance of personal computing technology, he was amazed at how laborious the relationship between people and computers remained. Dealing with computers was still very much a pain in the neck.

The reality is that it's harder to make any of this happen than it is to talk about it. I can't fault the computer industry for sticking with the conventional when that's what pays the bills. "Good enough" has worked out quite well over the last 25 years, building a multibillion-dollar global industry.

But it's too bad Dertouzos is no longer around. With voice recognition improving all the time, and with wireless technologies and mobile devices growing more powerful, Silicon Valley is getting closer to his concept of human-centric computing. It's been a lot like watching grass grow, but we won't need to wait another 25 years to reap the benefits.

At least I hope we don't.