
Questioning the next big thing

A reader writes that for some of the more attractive approaches to advancing computing, we will have to do some serious thinking.


In response to the Oct. 19 column by Charles Cooper, "Still waiting for the next big thing?":

Your point about the "next big thing" is correct.

We can also suspect that the popularity of the "next big thing" came from a "theory" about technology VC and IPO investing: that we would get "another Microsoft, another Cisco," and that we could move from those to an ongoing sequence of next big things.

Yes, this attitude was based mostly on bubble blowing, charging herds and financial fads. And your point that the developers know better is partly correct.

Once there is a success like Microsoft, the investment banks can say "another Microsoft" until the individual investors see too many resulting flops. Then we need another big success.

My views:

Paul Horn's "autonomic" computing is not good. Sure, what he lists contributes to a checklist of good design features that may be worth including in particular cases, but he leaves out far too much and thus does not add up to an effective new direction.

Horn's view concentrates on the final "solution" as might be seen by the customer's chief information officer or chief executive. Why are we not surprised to see IBM do such a thing?

But independent of how well such customer executives might like the result, or how much IBM likes to concentrate on such potted and sealed "solutions," the real challenges are elsewhere.

Horn is roughly correct in saying that computing is facing some challenges that could affect its growth. We can suspect that further progress in computing is proving too difficult for the pragmatic cowboy-coder approaches of the recent past.

But Horn essentially avoids any consideration of how we will achieve the technically feasible and economically attractive development of systems with additional desirable features, "autonomic" or otherwise. Here is where the challenges are, and Horn says nothing about them; he doesn't even pose the questions, much less provide solutions.

Horn's idea looks like another drop in a steady stream of publicity.

For some of the more attractive approaches to advancing computing, we will have to do some serious thinking. There is a "next big thing" on the horizon: using the Internet for high-quality, real-time digital motion video for conversations and conferencing.

Norman B. Waite
Wappingers Falls, N.Y.