
Surrendering U.S. leadership in IT

David Patterson, president of the Association for Computing Machinery, warns that the U.S. could easily forfeit its IT leadership to Asia.

A hearing this week in Washington will determine whether the United States will lead critical IT innovation in the 21st century.

The hearing, to be conducted Thursday by the House Science Committee, will focus on the state of research funding for information technology. Such funding, and the innovation it spurs, is vital to the U.S. economy and national defense.

Some historical perspective illuminates what's at stake. In 1957, the launch of Sputnik by the Soviet Union sent a wake-up call to the U.S. In response, the Defense Advanced Research Projects Agency was created and charged with preventing such technological surprises in the future. DARPA funded high-risk, high-reward research and sought to engage the best minds.

Since then, government-funded research has driven substantial innovation in U.S. IT and has been critical to this nation's leadership in technology. DARPA, together with the National Science Foundation, funds most academic IT research in this country. That research has swatted home runs such as the Internet, the majority of IT companies listed on the New York Stock Exchange and the Nasdaq, and the most technically advanced military in the world; it has also become a key economic driver. In the last decade alone, IT was responsible for 9 percent of the United States' gross national product. In 2001, a National Academies of Science and Engineering report gave 19 examples of IT research leading to industries worth a billion dollars or more. Federally funded academic research played a major role in every case.

Over the last 10 years, however, there has been a major shift in funding priorities and policy at DARPA and the National Science Foundation. Current DARPA policy mandates 12-month "go/no go" milestones for IT research, shortening horizons and discouraging long-term work. In addition, programs formerly open to academics are now classified, and other programs carry citizenship restrictions. In three years, DARPA cut its funding for academic IT research in half, to $123 million in fiscal year 2004. DARPA today is no longer engaging all the best talent in the long-term research that has been so vital to America's prowess in defense and essential to a robust economy.

The effects of this significant funding shift are far-reaching and long-lasting. In the last five years, IT proposals to the National Science Foundation jumped from 2,000 to 6,500, forcing the agency to leave many worthy proposals unfunded. Sadly, other agencies are not stepping in to take up the challenge. The Department of Homeland Security, which some hoped would augment the Science Foundation and DARPA, spends just a few million dollars per year for IT research. NASA also is downsizing its IT effort; in March it encouraged all but 70 of its 1,400 employees at its Silicon Valley center to retire.

Nor can we count on the IT industry itself for long-term research investment. Ironically, the high-tech industry increasingly depends on government-funded research partnerships with academic institutions to spur innovations. Those new companies that sprang to life in the recent past--Oracle, Dell, Cisco Systems--have no research labs. And of the established IT companies, only IBM and Microsoft maintain large and growing research arms.

If declining U.S. research funding simply slowed the pace of IT innovation, perhaps the upcoming House Science Committee hearing wouldn't be as critical to the nation as it is to the research community. However, the rest of the world isn't standing still.

Chinese Premier Wen Jiabao recently went to India to propose co-development of the next generation of IT, with China producing hardware and India developing software. He predicted the coming of the Asian century of the IT industry, as both countries strive for worldwide leadership in IT.

The history of IT is littered with companies that lost substantial leads in this fast-changing field. I see no reason why it couldn't happen to countries. Indeed, at the recent International Collegiate Programming Contest of the Association for Computing Machinery, four Asian teams finished in the top dozen, including the champion, while the best U.S. finish was 17th, the country's worst showing ever. If current U.S. government policies continue, IT leadership could easily be surrendered to Asia.

Allow me to suggest two questions for the hearing: Could loss of IT leadership--meaning, for example, that the IT available to the U.S. might be inferior to that of China or India--lead to a technological surprise akin to Sputnik 50 years ago? And if the U.S. must face serious competition for leadership, isn't it better to attract the best and brightest to U.S. universities, where their work can help grow the American economy, rather than have them innovate elsewhere?