
Commentary: IBM advances toward autonomic computing

The company takes a step toward the "2001: A Space Odyssey" vision of self-regulating computer systems built on parallel processing.

    The dream of building self-regulating computer systems using parallel processing was not new in the 1960s when "2001: A Space Odyssey" acquainted millions of us with HAL.

    Computer makers have been actively pursuing this dream for decades. Currently, IBM's Research Division is one of the leaders in this field, along with Intel and several others.

    IBM's "autonomic computing" project is based on the concept of SMASH--"simple, many and self-healing." This concept was applied in Deep Blue, the IBM computer that beat chess champion Garry Kasparov at chess a few years ago. IBM's present project, Blue Gene, an MPP (massively parallel processing) computer designed to map genomes, builds on Deep Blue but is larger and more sophisticated.

    IBM is making real advances in both hardware and software. One of the more interesting is its work on "computers on a chip," which combine multiple processor cores and memory on a single piece of silicon that can then be stacked and racked into massively parallel machines. This potentially lets IBM build parallel systems to order, combining enough processors to meet an individual customer's needs.

    The real challenge: software
    But if the hardware poses interesting challenges, software is what really limits efforts to get these processors to work together. If IBM can make real advances in dividing up tasks, in self-healing and in controlling the interactions among processors so that they better approach the functioning of the human brain, the result will be important contributions that eventually affect how computing is done.
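
    To make the self-healing idea concrete, here is a minimal Python sketch, our illustration rather than IBM's software, in which a scheduler farms discrete tasks out to a pool of worker processes and resubmits any task whose worker fails. The function names and the simulated failure rate are assumptions made for the example.

    import random
    from concurrent.futures import ProcessPoolExecutor, as_completed

    def compute(task):
        # Stand-in unit of work (a real system might run a genetics or
        # weather kernel here); fails randomly to simulate a bad node.
        if random.random() < 0.2:
            raise RuntimeError("simulated failure on task %d" % task)
        return task * task

    def run_with_retries(tasks, workers=4, max_attempts=3):
        # The "self-healing" idea in miniature: the scheduler, not the
        # programmer, notices a failed unit and reroutes the work.
        results = {}
        attempts = {t: 0 for t in tasks}
        with ProcessPoolExecutor(max_workers=workers) as pool:
            pending = {pool.submit(compute, t): t for t in tasks}
            while pending:
                for done in as_completed(list(pending)):
                    task = pending.pop(done)
                    try:
                        results[task] = done.result()
                    except RuntimeError:
                        attempts[task] += 1
                        if attempts[task] < max_attempts:
                            pending[pool.submit(compute, task)] = task
        return results

    if __name__ == "__main__":
        print(run_with_retries(list(range(10))))

    A task that fails every attempt is simply dropped here; a production scheduler would escalate it instead, but the division of labor is the point.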

    Meta Group views parallel processing as part of a larger body of processing concepts that we call "federated computing." This also includes peer-to-peer processing (as done by Gnutella and others), instant messaging and other systems that enable computers to work together without a server in the middle.

    The key to benefiting from federated computing is understanding the types of work it handles well. Federated approaches are useful in several situations:

    • Complex problems that can be easily broken up into discrete tasks, which can be farmed out to multiple machines

    • Simple tasks that must be done many times

    • Searches across many machines for a particular piece of information (illustrated in the sketch after this list)
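
    The third case can be sketched in a few lines of Python. The peers below are simulated with an in-process table, an assumption for illustration; real peers would be separate machines reached over a network, as in Gnutella. One query fans out to every peer, each peer searches only its own data, and the hits are merged with no central index involved.

    from concurrent.futures import ThreadPoolExecutor

    # Hypothetical peers and the files each one holds; no peer or
    # server knows the whole picture.
    SHARDS = {
        "peer-a": ["alpha.mp3", "beta.txt"],
        "peer-b": ["gamma.txt", "delta.mp3"],
        "peer-c": ["epsilon.txt"],
    }

    def search_peer(peer, query):
        # Each peer scans only its own local data.
        return peer, [name for name in SHARDS[peer] if query in name]

    def federated_search(query):
        # Fan the same simple query out to every peer in parallel,
        # then merge whatever comes back.
        with ThreadPoolExecutor() as pool:
            hits = pool.map(lambda p: search_peer(p, query), SHARDS)
            return {peer: found for peer, found in hits if found}

    if __name__ == "__main__":
        print(federated_search(".txt"))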

    The work in IBM's laboratory is still in the advanced research stage and is unlikely to affect corporate or government computing in the near future. Its first practical applications will be in the traditional supercomputing areas, such as genetics research and weather forecasting.

    It also will have applications in drug research--for instance, modeling drug interactions in the complex environment inside the human body--and in entertainment, where motion picture studios use massive amounts of computing power to generate realistic 3-D simulations.

    Depending on how far IBM can advance the software, its applications may remain limited, as today's parallel processing computers are, to problems that can be easily broken into many small tasks performed simultaneously.

    Meta Group analysts Dale Kutnick, William Zachmann, Val Sribar, Jack Gold and David Cearley contributed to this article.

    Visit Metagroup.com for more analysis of key IT and e-business issues.

    Entire contents, Copyright © 2001 Meta Group, Inc. All rights reserved.