Although it is not open to the press this year, Microsoft is using this week's TechFest internal science fair as an occasion to talk about some of the work it is doing to find new ways of connecting with computers.
Some products have already come to market, such as the multitouch features of Windows 7, while others, such as the Project Natal motion-sensing system, are close. But Microsoft sees those as just the beginning of a world in which the way we interact with computers is fundamentally altered.
"The transition to a natural user interface will change everything from the way students write term papers and play computer games to how scientists study global population growth and its impact on our natural resources," Microsoft's chief research and strategy officer, Craig Mundie, said in a statement.
This week, Microsoft is highlighting two of the big shifts. First, there is the way that computers collect input. That's shifting from a world dominated by the keyboard and mouse to one in which information is gathered from gestures, touch, sensors, and other input mechanisms, alongside traditional means such as the keyboard.
Among the things Microsoft is showing at TechFest is a project that could turn the air guitar into reality, allowing people to use an arm as the input device. But while playing Guitar Hero without a guitar is interesting, such research has broader implications. Think, for example, of how useful it might be to use one's arm whenever a keyboard bigger than the one that fits on a cell phone is needed.
Microsoft also showed a "Mobile Surface" that combines the gesture recognition of Natal with the concepts in the Surface tabletop computer to create a new kind of mobile device.
Another effort, called Project Gustav, shows how generic computer inputs, like a stylus, can start to take on more of the attributes of real-world objects. For example, digital pens have long been able to create digital ink on a page, and even imitate pressure-sensitive tools like pens or airbrushes. But Gustav goes a step further, using software to create a digital canvas and palette that mimic the way real-world materials interact and blend.
The more subtle shift, though potentially just as important, is the move from computers acting only when we tell them to, to computers acting on our behalf based on preset criteria. There has been talk of so-called computer "agents" for eons, but advances in computing power, particularly multicore processors, are starting to let the reality map more closely to the hype.
Here's a video of Microsoft Research chief Rick Rashid putting TechFest in context and showing a couple of demos. (Note: To view the video, you'll first have to download Microsoft's Silverlight.)