PlanetLab is an experimental network that sits on top of the Internet, allowing researchers and others to build and test applications that can essentially span the globe.
Work done on PlanetLab is expected eventually to let sites broadcast video in a coordinated fashion from computers located around the world to swarms of users simultaneously, without bogging down access. Similarly, virus hunters should be able to detect the spread of new viruses or denial-of-service attacks early.
"Our goal is to provide a playing field for people to try their ideas on," said Dave Culler, a U.C. Berkeley professor and manager of Intel's research "" associated with the university. "In the future, applications will spread themselves over a large fraction of the planet."
Although many take worldwide communications for granted, the Internet isn't the stable, instantaneous network it often appears to be. Servers can crash, and packets can take excessive hops between routers, slowing traffic along the way.
The problems will likely only get worse as new applications and services go online. Corporations are shifting crucial applications to the Internet, and employees are bound to experience slow access times and security problems.
One way around those issues will be to physically distribute applications to a wide variety of computers, which is where PlanetLab comes in. When completed, the network will consist of 1,000 servers geographically dispersed around the world.
Participating researchers will then be able to use slivers of the network as a test bed for optimizing applications to run on multiple machines at once.
Researchers try to develop applications with these issues in mind, but testing their ideas remains difficult and expensive in practice.
"There is a very high barrier to entry. How do you get (an application) across 1,000 machines to test?" said Larry Peterson, a professor at Princeton and one of the core designers of the project. "Simulation and emulation doesn't cut it."
Creating a network segregated from the mass of the Net also allows researchers to examine solutions for structural problems with the Web itself. "The Internet has become more rigid and far more brittle," Culler said. "That structure limits how much you can morph or change...Typically, applications are built on a few massive servers."
The network, which was conceived in March 2002, currently consists of 160 computers dispersed across 65 sites in 16 countries. The machines run a modified version of Red Hat Linux. The consortium expects to have 300 machines running by the end of the year, and the full 1,000-computer network is expected to be complete within a few years.
Ninety-five active research projects are using the network, including Codeen, a Princeton project geared toward improving content delivery, and Sophia, another Princeton project seeking to refine search across distributed computers. NetBait, an Intel project, seeks to improve bug detection. Some of these projects may eventually be integrated into the basic structure of the Internet, Culler said.
"We are going to discover a common set of services that other services can run on," he said.
Other participants include Hewlett-Packard Labs, the Massachusetts Institute of Technology, Harvard University, Cornell University, Rice University, and universities in Israel, China, England, Sweden, Taiwan and Germany.