During the past 50 years, it's been easy to pat ourselves on the back and look at a proud history of "next big things" that include data processing, automated payrolls, the personal computer, the graphical user interface, the Web and e-business.
An impressive heritage. But it won't do much good given where we've ended up.
Follow the evolution of computers from single machines to modular systems to personal computers networked with larger machines, and you see the growth of sophisticated architectures governed by software whose complexity now routinely demands millions of lines of code.
The Internet has added yet another layer of complexity by allowing us to connect--some might say entangle--a world of computers with telecommunications networks. In the process, computing systems have become increasingly difficult to manage and, ultimately, for customers to use.
In fact, this growing complexity of the IT infrastructure threatens to undermine the very benefits information technology aims to provide. Up until now, we've relied mainly on human administration to manage this complexity.
Unfortunately, we are starting to gunk up the works.
Like Charlie Chaplin falling into the machine in "Modern Times," the people responsible for the smooth running of these interconnected systems are starting to get in the way. Ironically, it will take higher levels of complexity to solve this problem.
But not to worry: the inspiration for solving it can be found in one of the complex systems of the human body, the autonomic nervous system. I'd argue that the grand challenge facing the industry now is an entirely new approach to building IT systems: autonomic computing.
The beauty of autonomic computing--and its biological inspiration--is that all of the complexity gets hidden from the user. Consider the autonomic nervous system: It tells your heart how many times to beat, and checks your blood's sugar and oxygen levels. It monitors your temperature and adjusts your blood flow and skin functions to hold your body at 98.6 degrees. But most significantly, it does all this without any conscious recognition or effort.
A new page in computing
It's time to design and build computing systems like that: capable of running themselves, adjusting to varying circumstances, and preparing their resources to handle most efficiently the workloads we put upon them. More specifically, autonomic computing will encompass these key functions:
1. To be autonomic, a system needs to "know itself" and comprise components that also possess a system identity.
2. An autonomic system never settles for the status quo; it always looks for ways to optimize itself.
3. An autonomic system must configure and reconfigure itself under varying and unpredictable conditions.
4. An autonomic system must be its own doctor: it must be able to recover from routine and extraordinary events that might cause some of its parts to malfunction.
5. An autonomic computing system must be an expert in self-protection.
6. An autonomic computing system knows its environment and the context surrounding its activity, and acts accordingly.
7. An autonomic system cannot exist in a hermetic environment; it must adhere to open standards.
8. An autonomic computing system will anticipate the resources needed to meet a user's information needs, optimizing them while keeping its complexity hidden from the user.
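The self-knowledge, self-configuration, and self-healing functions above amount to a closed monitor-analyze-plan-execute loop. A minimal sketch of such a loop, with hypothetical class and metric names (latency as the monitored signal, a capacity counter as the reconfigurable resource), might look like this:

```python
class AutonomicElement:
    """Toy autonomic element: monitors itself and reconfigures
    without human intervention. All names and thresholds here are
    illustrative assumptions, not a real system's API."""

    def __init__(self, target_latency_ms=100.0):
        self.target = target_latency_ms  # desired service level
        self.capacity = 1                # current resource allocation

    def monitor(self, observed_latency_ms):
        # "Know itself": observe its own behavior.
        return observed_latency_ms

    def analyze_and_plan(self, latency):
        # Decide whether reconfiguration is needed.
        if latency > self.target:
            return +1  # under-provisioned: add capacity
        if latency < self.target * 0.5 and self.capacity > 1:
            return -1  # over-provisioned: release capacity
        return 0       # within bounds: leave alone

    def execute(self, delta):
        # Reconfigure under varying, unpredictable conditions.
        self.capacity += delta

    def step(self, observed_latency_ms):
        # One pass through the monitor-analyze-plan-execute cycle.
        self.execute(self.analyze_and_plan(self.monitor(observed_latency_ms)))
        return self.capacity
```

The point of the sketch is that the decision logic lives inside the element itself; an operator sets only the target service level, and the cycle runs without conscious effort, much like the nervous-system analogy above.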
If systems and networks begin to adopt these attributes, IT professionals will be able to work at a higher level. Imagine no longer needing humans to handle mundane but complicated tasks such as repairing the root causes of failures, taking servers into "dry dock" for maintenance, or allocating resources.
This is the ultimate benefit of autonomic computing: systems that tackle the complexity "under the covers," freeing IT professionals to drive creativity, innovation and opportunity. Entire new business models can emerge, and one of the best early examples is the delivery of IT as a utility-like service over the Internet.
Work already underway
The good news: work is already being done to make this happen. Lessons from key research areas such as artificial intelligence, control theory and catastrophe theory, as well as some of the early work in cybernetics, give us a variety of approaches to explore.
Current research projects at labs and universities include self-evolving systems that can monitor themselves and adjust to some changes; "cellular" chips capable of recovering from failure; heterogeneous workload management that balances and adjusts the workloads of many applications across various servers; and traditional control theory applied to the realm of computer science, to name just a few.
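The last of those directions, applying traditional control theory to computing, can be illustrated with the simplest possible controller: a proportional feedback loop that nudges a system metric toward a setpoint. The metric (CPU utilization) and gain below are assumptions chosen for illustration, not drawn from any particular project:

```python
def p_controller(setpoint, measured, gain=0.5):
    """Proportional controller: the correction applied is
    proportional to the error between setpoint and measurement.
    In a computing context, 'measured' might be CPU utilization
    and the output an adjustment to the admitted workload."""
    error = setpoint - measured
    return gain * error

# Drive utilization from 30% toward a 70% setpoint.
utilization = 30.0
for _ in range(10):
    utilization += p_controller(70.0, utilization)
# With gain 0.5 the error halves each iteration,
# so utilization converges toward 70.0.
```

Real workload managers layer estimation and stability analysis on top of this idea, but the core pattern, measure, compare to a goal, correct, is the same feedback loop control theory has used for decades.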
But this is not enough.
One of the primary reasons for today's complexity is overspecialization. For decades, the industry has worked to solve problems at a micro level, rather than taking a holistic view. Autonomic computing is an attempt to shift gears and make sure "smaller, faster, cheaper" is no longer pursued in isolation, but in the context of making systems work better and smarter.
No company, no university, no lab can do this alone. We're calling on our academic colleagues to drive exploratory work in this area, and we're committing research funding. IBM will open up a substantial grant program with more than 50 grants available to support this effort. Ideally, these academic pursuits will lead to the formation of consortia, conferences and other institutions to advance the science required to tackle this challenge. Also, there must be cross-industry participation to ensure that these grassroots efforts are supported with additional funding and the adoption of necessary standards.
It's time to hide the complexity, to reduce our skills shortage and to bring back the value of IT. And the stakes are high: if we as an IT community don't meet this challenge, we won't have any more "next big things."