Why computer design must change

Greg Papadopoulos, CTO of Sun Microsystems, says the Internet's coming third wave will force the industry to rethink basic assumptions that have guided the development of computer systems for the last two decades--or else!

The sheer scale of the challenge is like nothing we've faced before. But it's not as if the high-tech industry is looking up at a rock wall like El Capitan or the famed North Face of Mount Everest. This is worse--it's like preparing to scale a tidal wave.

The wave analogy is apt, I think, because the Internet has been coming at us in waves. We create the waves because we want to ride them, like surfers shooting through the Banzai Pipeline, but our skills are about to be severely tested.

The first wave was a network of computers that swelled to encompass hundreds of millions of systems, all connected, all continually exchanging data.

The second wave, the one we're riding now, could be described as a network of things that embed computers. It's made up of wireless phones, two-way pagers and other handsets, game players, teller machines, and automobiles. In short, billions of potential connections.

The third wave is on the way, and even as we create it, we need to prepare ourselves; it's shaping up to be a regular tsunami. I call it a network of things. Trillions of things. Things you'd hardly think of as computers. So-called sub-IP (Internet Protocol) devices such as light bulbs, environmental sensors, and radio-frequency identification (RFID) tags.

In the first wave, the Internet connected machines, and people tapped in when they could. The second wave is making the connection more or less continuous. The third promises to make it virtually indistinguishable from the various aspects of daily life.

One of the interesting side effects of the next wave will be a shift in the flow of information. Right now, most information flows outward from the Internet. In the future, more will flow back to the data center, much the way a pit crew gathers telemetry from a racecar today.

Soon we'll be able to add miniature ID tags to all kinds of products, so we can instantly discover not only the price of the item, but where and when it was made, how it was delivered, and a host of other useful data. We'll be able to track in-transit temperature fluctuations that can affect the expiration date of a medication or the freshness of food.

Such tags cost about 50 cents to make; as the cost comes down, we'll begin to see them on more and more items. Each will have its own digital signature and a presence in cyberspace, which promises to enhance the efficiency of the whole supply chain, from maker to user to recycler.
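
To make that concrete, here is a minimal sketch in Java of the kind of record that might sit behind such a tag--an identity, its provenance, and a log of in-transit temperature readings. The TagRecord class and its methods are my own illustration, not any actual RFID standard or product interface.

    import java.time.Instant;
    import java.util.ArrayList;
    import java.util.List;

    // Hypothetical record behind a product's ID tag: an identity, its
    // provenance, and the in-transit telemetry described above.
    public class TagRecord {
        private final String tagId;          // the tag's unique digital signature
        private final String origin;         // where the item was made
        private final Instant manufactured;  // when it was made
        private final List<Double> tempLog = new ArrayList<>(); // readings, deg C

        public TagRecord(String tagId, String origin, Instant manufactured) {
            this.tagId = tagId;
            this.origin = origin;
            this.manufactured = manufactured;
        }

        // Sensors along the supply chain append readings as the item travels.
        public void recordTemperature(double celsius) {
            tempLog.add(celsius);
        }

        // A medication or food item is still good only if every reading
        // stayed inside its allowed range.
        public boolean withinTolerance(double minC, double maxC) {
            return tempLog.stream().allMatch(t -> t >= minC && t <= maxC);
        }

        // Provenance summary a scanner might display at point of sale.
        public String summary() {
            return tagId + " made in " + origin + " on " + manufactured;
        }
    }

In this imagined flow, a shipper's scanner would call recordTemperature at each checkpoint, and the pharmacy at the end of the chain would ask withinTolerance(2.0, 8.0) before putting a vaccine on the shelf.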

This is science, not fiction. But the unprecedented scale of this third wave--and the incredible diversity of devices and interfaces--makes the challenge quite obvious: How on earth is anyone supposed to manage all this?

If we don't rethink the way we design computers, if we don't find new ways of reasoning about distributed systems, we may find ourselves eating sand when the next wave hits. Over the past couple of decades, the definition of a system has remained constant amid rapid refinements. The components have always included microprocessors, disks, memory and network input/output. In the decades ahead, things will look much different: Computers, storage systems and IP networks will be the components.

We must learn--and are in fact learning--to virtualize the elements of the network to create a single pool of resources that can be dynamically allocated, matching resources to services on the fly. This will enable us to automate change management, reduce complexity, better utilize resources, and lower total cost of ownership.
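
To sketch what that pooling might look like in code--again in Java, with names of my own invention rather than any real product interface--imagine a single pool from which a service carves out capacity on the fly and returns it when demand falls:

    // Hypothetical pool of virtualized capacity--processors and storage
    // abstracted away from the machines that supply them. Illustrative only.
    public class ResourcePool {
        private int cpus;       // unallocated processor capacity
        private int storageTB;  // unallocated storage, in terabytes

        public ResourcePool(int cpus, int storageTB) {
            this.cpus = cpus;
            this.storageTB = storageTB;
        }

        // Carve out resources for a service on the fly; refuse if the pool is dry.
        public synchronized void allocate(String service, int wantCpus, int wantTB) {
            if (wantCpus > cpus || wantTB > storageTB)
                throw new IllegalStateException("pool exhausted for " + service);
            cpus -= wantCpus;
            storageTB -= wantTB;
            System.out.println(service + ": lit up with " + wantCpus
                    + " CPUs and " + wantTB + " TB");
        }

        // Return capacity to the pool when a service scales down.
        public synchronized void release(int freedCpus, int freedTB) {
            cpus += freedCpus;
            storageTB += freedTB;
        }

        public static void main(String[] args) {
            ResourcePool pool = new ResourcePool(1024, 500);
            pool.allocate("billing", 128, 40);     // resources matched to the service
            pool.allocate("analytics", 512, 200);
            pool.release(128, 40);                 // billing winds down; capacity returns
        }
    }

The bookkeeping here is trivial by design; the point is the contract. Applications ask the pool for capacity, and no application needs to know which machines supply it.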

What's more, virtualization will make it possible to take distributed applications from concept to wide-scale deployment far more quickly. Just carve out the resources you need and light up the application. It should be no more difficult than running an application on a single computer today.

With this new architecture, computers won't attach to networks; they will be built from networks. This shift enables radically higher-scale microprocessors, exabytes of storage, terabits of bandwidth and billions of IP connections--all of which will be imperative as we move into the Net's third wave.