Still waiting for the next big thing?

CNET News.com's Charles Cooper writes that you'll have better luck waiting around for Godot to show up. Instead, he says, it's time to rethink the conventional wisdom about big technology advances and how to gauge progress.

I recently reconnected with an old source who had the good fortune to take his winnings off the table before Silicon Valley imploded.

And because he doesn't need to rush back to the rat race anytime soon, he's been kicking tires--make that kicking lots of tires--as he assesses his next move during a leisurely interregnum. So it was that we wound up talking about what might be on the horizon--more specifically, he wanted to know what will qualify as the computer industry's next big thing.

Good question. In fact, it's an industry perennial--especially in the run-up to major trade shows like next month's fall Comdex--and so I suppose this is as good a time as any for next-big-thing ruminations. But the more I've thought about it, the more I've wondered whether his question leads to an even larger one.

To be sure, the computer industry has brought forth many Wow! innovations, starting with the personal computer and jumping right up to the present and the Internet, which--the dot-com bust notwithstanding--is helping to transform modern communications and commerce. But if you pause to examine the years between those two bracketing developments, is the timeline really punctuated by a regular series of earthshaking technology advances? I think not.

In fact, the common interpretation of the development of computer technology over the two decades since the advent of the IBM PC turns out to be based more on popular myth than on an actual accounting of history.

The IBM PC and the Internet
The clip morgue shows that Apple, Commodore and a host of small computer makers handily beat Big Blue to market by several years. Yet it was not until 1981, when IBM introduced its own personal computer, that the PC era began in earnest. By any measure, this was the catalyzing event that legitimized the personal computer as a productivity tool for business.

That product debut spawned an industry of peripheral and software companies that grew up around the IBM PC. In short order, so-called clone manufacturers like Compaq Computer, Gateway and Dell emerged to do IBM one better by turning out computers that ran faster or sold for less money--or sometimes both.

Throughout the rest of the decade, PC technology advanced in fits and starts. Systems became faster, and operating systems got less clunky. CD-ROM drives and sound technology slowly faded up into the mainstream. Improving software functionality extended from clients to servers, setting the stage for companies to organize and distribute data in exciting new ways.

But did any of these advances qualify as next big things? Not compared with the rollout of the PC, an event that had a multibillion-dollar multiplier effect on the international economy.

For most of the past 20 years, the standard refrain after returning from Comdex was that the cab lines were long and the parties were grand. As for what was new and groovy on the tech front: more often than not, it was just the same old same old. That may have been the word from jaded insiders, but it also pointed to a more basic truth about the incremental nature of progress in the computer business, one that held until 1994, when the Internet began to capture the popular imagination.

But after a five-and-a-half-year run, even the Internet "revolution" is on the verge of getting stale. In the aftermath of the bubble's burst, the extreme I-told-you-so pronouncements about how so much of this was just a highfalutin game of three-card monte are coming fast and furious. Of course, that view is about as sensible as the preceding period's irrational exuberance. Fact is, the pendulum swings hard in both directions, and the momentum easily carries some people away.

The more interesting question to ask is how we go about making the Internet more viable.

Developers to the rescue?
In our fixation on the next big thing, it's easy to overlook the fact that most of the industry's progress is actually the result of a series of small steps, be they the adoption of something as seemingly insignificant as the SOAP protocol (which is actually very significant) or an agreement over the use of XML (also a big step forward for Internet-based communications and commerce).
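For the curious, here's a rough sketch of what one of those small steps looks like in practice: a SOAP call is nothing more exotic than an XML envelope posted over HTTP. The sketch below is written in Python, and the stock-quote service it talks to (the URL, the namespace and the GetQuote operation) is hypothetical, invented purely for illustration.

    # A minimal SOAP 1.1 request: an XML envelope posted over HTTP.
    # The endpoint, namespace and GetQuote operation are hypothetical.
    import urllib.request

    envelope = """<?xml version="1.0" encoding="utf-8"?>
    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
      <soap:Body>
        <GetQuote xmlns="http://example.com/stocks">
          <symbol>IBM</symbol>
        </GetQuote>
      </soap:Body>
    </soap:Envelope>"""

    request = urllib.request.Request(
        "http://example.com/stockservice",       # hypothetical endpoint
        data=envelope.encode("utf-8"),
        headers={
            "Content-Type": "text/xml; charset=utf-8",
            "SOAPAction": "http://example.com/stocks/GetQuote",
        },
    )
    # urllib.request.urlopen(request) would then return the XML reply.

The point is not the plumbing but how little of it there is; that modesty is exactly what makes such protocols easy to adopt.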

The term "iteration" may not be especially sexy, but finding a smarter, faster, cheaper or quicker way of doing what's already being done is a huge boon to the regular folks who depend on this stuff to do their jobs. Paul Horn, a top executive at IBM Research, who recently wrote in this space that the next challenge for the computer industry is to design and build computing systems that can run themselves, "adjusting to varying circumstances, and prepare their resources to handle most efficiently the workloads we put upon them."

If he's right--and I think he is--the legions of small developers who do the grunt work of figuring out the right patterns of 0s and 1s are going to be the ones to shoulder most of that burden. Toiling away by themselves or in small groups, they know better than most that there are no quick fixes.

Taken piece by piece, their work may not qualify for page-one treatment; taken together, the accumulated impact of their labor is huge, something along the lines of a next big thing.