The technology giant is launching a major push within the industry and academia toward autonomic computing, the science of creating computing systems that can configure, tune and even repair themselves.
Under autonomic computing, which is also called holistic or introspective computing, databases will continuously re-examine query routes for more efficiency. Worldwide computing grids will move data along so that files can follow an executive going from New York to Paris.
"We need a higher-level framework for the components to fit together in a higher-level way," said Alan Ganek, vice president of autonomic computing at IBM. Ganek spoke at the company's Almaden Research Center here, during a three-day conference dedicated to the new computing concept. "Systems have to work together, and they have to work together in a way that manages itself for (our) benefit," he said.
The term "autonomic" comes from an analogy to the autonomic nervous system in the human body, which regulates functions without conscious effort. When people run, Ganek noted, they don't think about opening their pores or elevating their heart rate. It just happens.
Though these systems will be designed to be less error-prone, they'll also be built with the expectation that errors will invariably occur.
People "have memory failures all the time, but we continue to operate fairly well," joked John Hennessy, president of Stanford University and a founder of chip-design company MIPS Technologies.
Out with the old black magic
The autonomic thrust largely grows out of the looming, and arguably inevitable, shortage of people trained to manage computer systems. The Internet is simply becoming too large and complex for the world's technicians to manage.
"It is a black magic on how databases are tuned," Surajit Chaudhuri said at the conference. Chaudhuri heads up the data management and exploration group at Microsoft. "It is tough to ship a tuning guru with every database."
Costs are also rising. In the 1990s, approximately 80 percent of the cost of major computer systems revolved around hardware and software acquisitions, according to IBM studies. Now the human expenses are roughly equal to equipment costs. If nothing changes, human costs will be double equipment costs within five to six years, Ganek said.
In IBM's view, autonomic computing systems must follow four principles. They must be self-configuring (able to adapt to changes in the system), self-optimizing (able to improve performance), self-healing (able to recover from mistakes), and self-protecting (able to anticipate and cure intrusions).
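Of the four principles, self-healing is perhaps the easiest to picture: a component watches its own health and takes corrective action when errors pile up. The sketch below is purely illustrative, assuming a hypothetical `Component` class with a simple error-count threshold; it is not drawn from any IBM product.

```python
# Hypothetical sketch of a self-healing loop: the component tracks its own
# errors and "repairs" itself (here, a simulated restart) past a threshold.

class Component:
    def __init__(self, error_threshold=3):
        self.error_threshold = error_threshold
        self.error_count = 0
        self.restarts = 0

    def record_error(self):
        self.error_count += 1

    def heal(self):
        """Self-healing step: restart once errors reach the threshold."""
        if self.error_count >= self.error_threshold:
            self.restarts += 1      # stand-in for a real restart or repair
            self.error_count = 0    # state is reset after recovery
            return True
        return False

c = Component()
for _ in range(5):
    c.record_error()
    c.heal()
# After five errors with a threshold of three, the component has
# recovered once and is partway toward a second recovery.
```

A real system would replace the restart counter with genuine recovery actions (failover, reconfiguration, rollback), but the monitor-then-act loop is the common shape.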
Michael Franklin, an associate professor at the University of California at Berkeley, is performing research on adaptive data flow in databases. Roughly, these databases seek to determine the most efficient way to answer a query on the fly by, among other techniques, refining the query to ease traffic on a router. Routers can in turn "reward" the database by giving it priority on later queries.
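The general idea behind such adaptivity can be sketched in a few lines. This is a simplified illustration, not Franklin's actual system: a router keeps a running latency estimate for each path and sends each query along the currently cheapest one, refining its estimates as observations arrive. The class and route names are invented for the example.

```python
# Illustrative adaptive routing: per-route cost estimates are updated with
# an exponential moving average, so routing decisions track recent behavior.

class AdaptiveRouter:
    def __init__(self, routes, alpha=0.5):
        self.alpha = alpha
        self.cost = {r: 1.0 for r in routes}  # neutral starting estimates

    def choose(self):
        # pick the route with the lowest estimated latency
        return min(self.cost, key=self.cost.get)

    def observe(self, route, latency):
        # blend the new measurement into the running estimate
        old = self.cost[route]
        self.cost[route] = (1 - self.alpha) * old + self.alpha * latency

router = AdaptiveRouter(["fast_link", "slow_link"])
router.observe("slow_link", 5.0)   # slow_link proves expensive
router.observe("fast_link", 0.5)   # fast_link proves cheap
best = router.choose()             # → "fast_link"
```

The moving average is what makes the behavior adaptive: if `fast_link` later degrades, its estimate rises and traffic shifts away without any operator intervention.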
In practice, though, building these systems won't be easy, Hennessy stated. Although computers are human inventions, we still don't completely understand them. Engineers don't honestly know how a chip or computer will function until it's complete. And software doesn't appear to be getting more reliable over time, judging by the steady recurrence of problems.
"We don't understand something very fundamental about how we build systems," Hennessy said.
At the same time, consumers have become increasingly intolerant of failures while placing ever-greater demands on the technology they use.
"Imagine how life would be if your car crashed as much as your PC," Hennessy said. "It simply wouldn't be tolerable. Access to services is the killer application, and ensuring that services are available is the key metric. It is going to require hardware people to work much closer with operating-system people, with software people and with network people."
In some ways, research on holistic systems could resemble how the airplane and the RISC (reduced instruction set computing) processor developed, said Microsoft's Chaudhuri. In those industries, individual companies would invent their own ways to tackle a problem, and the public and other researchers could then compare the results.
Any new features that come about as a result of this process will also have to be examined in terms of negative side effects they might have on reliability.
"We've got to think really hard before we introduce" new features, Chaudhuri said. "Featurism hurts self-tuning. If you have too many variables, it is harder to learn."
IBM's internal efforts
IBM, Ganek added, takes autonomic computing fairly seriously, and substantial benefits from autonomic research should begin to reach the market in the next few years. By contrast, when IBM used its Almaden conference last year to highlight nanotechnology, which revolves around building chips out of molecules, the payoff lay much further in the future. Research on autonomic computing is also being conducted at a wide variety of universities; researchers from Cornell, Columbia, Stanford, Berkeley and NASA made presentations at the conference.
It was just this past February that Big Blue appointed Ganek to take over the newly created autonomic computing group, and the team of researchers is being formed now. Once complete, the group will work with the gamut of IBM's product and research divisions.
Some steps toward autonomic computing have already been taken. IBM's servers, for instance, already notify administrators of pending equipment failures, and the company's Tivoli Risk Manager has been tuned to use sensors to sniff out unwarranted intrusions.
Still, companies now have to tackle the larger job of ensuring that the autonomic features of these individual products can work together.