The hearing, to be conducted Thursday by the House Science Committee, will focus on the state of research funding for information technology. Such funding, and the innovation it spurs, is vital to the U.S. economy and national defense.
Some historical perspective illuminates what's at stake. In 1957, the launch of Sputnik by the Soviet Union sent a wake-up call to the U.S. In response, the Defense Advanced Research Projects Agency was created and charged with preventing such technological surprises in the future. DARPA funded high-risk, high-reward research and sought to engage the best minds.
Since then, government-funded research has driven substantial innovation in U.S. IT and has been critical to this nation's leadership in technology. DARPA, together with the National Science Foundation, funds most academic IT research in this country. Besides producing home runs such as the Internet, the majority of IT companies listed on the New York Stock Exchange and the Nasdaq, and the most technically advanced military in the world, IT research has become a key economic driver: in the last decade alone, IT was responsible for 9 percent of the United States' gross national product. In 2001, a National Academies of Science and Engineering report gave 19 examples of IT research leading to industries worth a billion dollars or more. Federally funded academic research played a major role in every case.
Over the last 10 years, however, there's been a major shift in funding priorities and policy at DARPA and the National Science Foundation. The current DARPA policy, which mandates 12-month "go/no go" research milestones for IT, has shortened deadlines, thus discouraging long-term research. In addition, programs formerly open to academics are now classified; other programs have citizenship restrictions. In three years, DARPA halved academic IT research to $123 million in fiscal year 2004. DARPA today is no longer engaging all the best talent in long-term research, which has been so vital to America's prowess in defense and essential to a robust economy.
Nor can we count on the IT industry itself for long-term research investment. Ironically, the high-tech industry increasingly depends on government-funded research partnerships with academic institutions to spur innovation. The new companies that sprang to life in the recent past--Oracle, Dell, Cisco Systems--have no research labs. And of the established IT companies, only IBM and Microsoft maintain large and growing research arms.
If declining U.S. research funding simply slowed the pace of IT innovation, perhaps the upcoming House Science Committee hearing wouldn't be as critical to the nation as it is to the research community. However, the rest of the world isn't standing still.
Chinese Premier Wen Jiabao recently went to India to propose co-development of the next generation of IT, with China producing hardware and India developing software. He predicted the coming of the Asian century of the IT industry, as both countries strive for worldwide leadership in IT.
The history of IT is littered with companies that lost substantial leads in this fast-changing field. I see no reason why it couldn't happen to countries. Indeed, at the recent International Collegiate Programming Contest of the Association for Computing Machinery, four Asian teams finished at the top, including the champion, while the best U.S. finish was 17th. If current U.S. government policies continue, IT leadership could easily be surrendered to Asia.
Allow me to suggest two questions for the hearing: Could loss of IT leadership--meaning, for example, that the IT available to the U.S. might be inferior to that of China or India--lead to a technological surprise akin to the one with Sputnik 50 years ago? And, if the U.S. must face serious competition for leadership, isn't it better to attract the best and brightest to U.S. universities, where their work can help grow the American economy, rather than have them innovate elsewhere?