Leading scientists gather for computer love fest

From an astrophysicist to leaders of the nation's most hallowed research labs, scientists urge IT workers to continue packing computers with more power.

SAN JOSE, Calif.--You've got a friend in science.

That's the message to the information technology sector from dozens of prominent scientists who have gathered here for the three-day Association for Computing Machinery (ACM) conference. Ranging from an astrophysicist to leaders of the nation's most hallowed research labs, the scientists are urging IT workers to continue packing computers with more power and continue enriching the Internet--regardless of a severe stock market downturn that has plunged the sector into a morass of layoffs, budget cuts and slumping morale.

"In science, computing has already set off irreversible changes...at all scales, from nano to global," said Ruzena Bajcsy, assistant director of computer and information science and engineering at the National Science Foundation. "The avalanche is still growing."

It's no surprise that the ACM conference is a veritable love fest between the scientific and IT communities: The previous ACM conference, a 1997 affair whose theme was "The Next 50 Years of Computing," was also a group hug for computer advances.

But the tone of the 2001 conference, dubbed "Beyond Cyberspace...A Journey of Many Directions," is far more subdued than in 1997. Back then, the stock market was just starting to discover Internet stocks, and technology start-ups were raking in millions of dollars in venture capital. Four years later, the mood is dramatically darker.

Moore's Law no more?
In an informal survey of more than 200 conference attendees, most agreed that the pace of technological advancement in semiconductors and general computing cannot keep scorching along as it did in the late 20th century. Moore's Law--the generally accepted principle that a chip's data density will double approximately every 18 months--will stop working in 2006, the attendees speculated in a spot poll.

That's far less bullish than during the previous ACM exposition. Attendees of the 1997 event said Moore's Law, the 1965 brainchild of Fairchild Semiconductor research director Gordon Moore (who later co-founded Intel), would continue working through at least 2016. Moore's Law is generally considered the linchpin of the information technology industry's rapid advancement--and the reason a $2,000 computer will cost $900 next year and be obsolete the year after that.
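
To put the 18-month doubling in concrete terms, here is a minimal back-of-the-envelope sketch in Python; the horizons and the cost-per-computation assumption are illustrative choices of ours, not figures presented at the conference.

```python
# Back-of-the-envelope Moore's Law arithmetic: density doubles every 1.5 years.
def density_multiplier(years: float, doubling_period: float = 1.5) -> float:
    """How many times denser a chip becomes after `years` of Moore's Law."""
    return 2 ** (years / doubling_period)

if __name__ == "__main__":
    for horizon in (1, 5, 9, 15):  # illustrative horizons, in years
        print(f"After {horizon:2d} years: {density_multiplier(horizon):7.1f}x the density")
    # If cost per unit of computing falls by the inverse factor, it drops
    # to about 0.63 of its value each year -- roughly a third per year.
    print(f"Implied yearly cost factor: {1 / density_multiplier(1):.2f}")
```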

Bob Metcalfe, who invented Ethernet and moderated the conference, was not the first to speculate about the end of Moore's Law. But the scientists who presented research during the conference would hear nothing of a potential slowdown.

They emphasized that the information technology sector--and specifically the Internet--had emerged as one of the most important conduits for the scientific community, providing data-rich insight into discoveries ranging from intergalactic collisions to viral diseases.

"I'm a little scared because so much of what we do depends on research," astrophysicist Neil deGrasse Tyson said of the potential stalling of semiconductor power. "People question whether we need Microsoft Word to run 1,000 times faster. But we all know that the real frontier, the real limits, are to solve problems we can't even think of today."

Tyson, who specializes in star formation models for dwarf galaxies and exploding stars, said it's virtually impossible to overstate the astrophysics community's reliance on the technology industry.

For example, the National Academy of Sciences asked Congress last year for funding to build a National Virtual Observatory (NVO), in which astrophysicists around the world could submit data to and retrieve data from a central database powered by supercomputers and accessible over the Internet. The request, detailed in Astronomy and Astrophysics in the New Millennium, ranked the NVO as the most important small initiative in astronomy for the first decade of the 21st century, and it marked the first time the group did not specifically request a brick-and-mortar observatory.

"You just need access to the Internet, then you get your project approval and through the Internet you have your patch of the sky in front of you," said Tyson, who referred to NVO as "the democratization of the universe" because it frees scientists from having to have access to expensive telescopes and other equipment. "We're all very excited about it, but we're still grappling with how to make it work. It's more complicated than the Hubble Space Telescope."

Predicting catastrophes
Oceanographers are also relying increasingly on by-products of the information technology industry, including data-mining tools that help predict weather patterns, volcanic eruptions and earthquakes.

Marcia McNutt, president and CEO of the Monterey Bay Aquarium Research Institute, said data mining could help analyze the information that a growing number of drifting probes are collecting about the ocean and the ocean floor. Eventually, McNutt said, oceanographers will be able to predict ocean currents, water temperatures and the extent of underwater plate shifts and volcanic eruptions with a relatively high degree of accuracy, making weather forecasts much more accurate--possibly even predicting patterns for storms a year in advance.
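
As a minimal sketch of the kind of analysis McNutt describes, the snippet below fits a straight-line trend to temperature readings from a drifting probe and extrapolates it forward. The readings and the field layout are invented for illustration; real oceanographic data mining is far more elaborate.

```python
# Toy trend extraction from drifting-probe readings (invented data).
from statistics import mean

def linear_trend(samples):
    """Least-squares fit of value vs. time; returns (slope, intercept)."""
    xs = [t for t, _ in samples]
    ys = [v for _, v in samples]
    x_bar, y_bar = mean(xs), mean(ys)
    slope = (sum((x - x_bar) * (y - y_bar) for x, y in samples)
             / sum((x - x_bar) ** 2 for x in xs))
    return slope, y_bar - slope * x_bar

# Hypothetical readings: (day, sea-surface temperature in degrees C)
readings = [(0, 14.2), (30, 14.6), (60, 15.1), (90, 15.3)]
slope, intercept = linear_trend(readings)
print(f"Projected temperature at day 365: {slope * 365 + intercept:.1f} C")
```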

The ability to predict catastrophes will become increasingly important as the world's population migrates to coastal cities. In about 25 years, McNutt said, 75 percent of the global population will be living on or near an ocean's edge--magnifying the effects of typhoons, melting polar ice caps, earthquakes, marine pollution and algae blooms on humankind.

Pooling data from oceanographers around the world and using data-mining tools to crunch numbers could work so well that eventually humans may have to determine whether it's appropriate to engage in "marine engineering"--tweaking water temperatures, encouraging algae growth and even tinkering with marine animals' genetics to make the oceans more efficient at producing food and sustaining human life offshore, she said.

"We have to proceed with caution and stop when necessary," McNutt said. "That's the hardest part. Sometimes we develop an entitlement sentiment and can't stop."

Distributed computing
Bajcsy, who recently assumed a position in Washington helping to lobby for the National Science Foundation, said scientists rely on the Internet and the supporting IT industry to facilitate worldwide collaboration.

For example, Bajcsy said, the GriPhyN (Grid Physics Network) collaboration hopes to pair IT specialists with scientists to uncover new patterns in four confounding physics problems, including the origins of mass and matter at the smallest length scales and the gravitational waves from pulsars, supernovae and in-spiraling binary stars. The goal is to unite thousands of scientists around the globe via distributed, high-bandwidth networks, ultimately resulting in a set of production data grids far more detailed and accurate than any existing research.

Distributed computing uses the CPU power of computers that otherwise goes to waste--not just during the night or when the screensaver switches on, but even between keystrokes. It's not a new idea, as shown by the popularity of SETI@home, a screensaver program that processes radio telescope signals to search for extraterrestrial communications. But the more mainstream scientific community has only recently begun to understand its far-reaching possibilities for data analysis.
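
Here is a minimal sketch of the SETI@home-style cycle described above: fetch a work unit from a coordinating server, crunch it with spare cycles, and send the result back. The server address, payload format and analysis step are all invented for illustration; the real project ships its own client and protocol.

```python
# Toy distributed-computing client loop (hypothetical coordinator and format).
import json
import time
import urllib.request

SERVER = "http://example.org/workunits"  # hypothetical coordinating server

def fetch_work_unit() -> dict:
    """Ask the coordinator for the next chunk of data to analyze."""
    with urllib.request.urlopen(f"{SERVER}/next") as resp:
        return json.load(resp)

def process(samples: list) -> float:
    """Stand-in analysis: report the strongest signal in the chunk."""
    return max(samples)

def report(unit_id: str, result: float) -> None:
    """Send the finished result back to the coordinator."""
    body = json.dumps({"id": unit_id, "result": result}).encode()
    req = urllib.request.Request(f"{SERVER}/results", data=body,
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req)

if __name__ == "__main__":
    while True:
        unit = fetch_work_unit()
        report(unit["id"], process(unit["samples"]))
        time.sleep(1)  # a real client would yield whenever the user is active
```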

Not only does distributed computing allow scientists to tap idle computer time, Bajcsy said, but it also helps unite academics from around the world.

"Not all brains are in Boston or Palo Alto," Bajcsy said, referring to American technology hubs in Massachusetts and California. "They are distributed, and they need to be connected."

Click "start" to finish
But with the gushing praise for the wonders of the technological revolution came a little criticism.

Michael Dertouzos, professor and director of the MIT Laboratory for Computer Science, said the IT industry has failed to create "human-centered computing" and instead requires people to have a relatively high degree of skill to perform even the simplest digital tasks. For example, he said, computer users must know that to turn off the computer they have to click "Start"--hardly an intuitive way to end a computing session.

Although the scientific community has learned many computing tricks, revolutions in fields ranging from genetics to astronomy will not occur until the computing industry makes fundamental changes to its machines, he said. Advances in speech recognition software, for example, will open up the Internet to the estimated 2 billion people worldwide who cannot read or write, vastly increasing the size of the Internet and the amount of data collected on it.

"We have been building computers for 40 years, but they are not very different at the base level," Dertouzos said. "We're not exploiting this technology revolution. We're hardly scratching the surface."