If you want to see where the technology industry is heading in the next few years, a quick review of the past might be useful. As Amar Bhide of Columbia Business School reminds us in Thursday's Wall Street Journal, the personal computer industry was born in the pain of the 1980s economic recession.
Why then? "History suggests that Americans don't shirk from venturesome consumption in hard times," Bhide writes, suggesting that consumers show a appetite for risk that far exceeds the near-term value they individually derive from things like software and mobile devices, a tendency that is unlikely to abate in our recessed economy. It is this inability to correctly gauge costs that may work in technology's favor, and society's:
Economists regard the innovations that sustain long-run prosperity as a gift to consumers. Stanford University and Hoover Institution economist Paul Romer wrote in the "Concise Encyclopedia of Economics" in 2007: "In 1985, I paid a thousand dollars per million transistors for memory in my computer. In 2005, I paid less than ten dollars per million, and yet I did nothing to deserve or help pay for this windfall."
In fact, Mr. Romer and innumerable consumers of transistor-based products such as personal computers have played a critical, "venturesome" role in generating their windfalls.
Buying something new requires taking risks. Products that work in a lab may not function as well in the real world. Repeated use of a product may reveal flaws that cause malfunctions, increase operating costs, or pose health and safety hazards to the user or the environment.
This time around, we can do one better. Much of the "consumer" experimentation in software will not involve any financial outlay at all. It will, instead, require time and interest, which may well be available in abundance as people spend more time online and less time employed.
And it will be overwhelmingly open source.
Indeed, as Sean Dodson writes in The Guardian, "with money becoming increasingly scarce, and the free alternatives growing in sophistication, free is finally threatening to go mainstream."
Could it be that the 1980s PC revolution will give way to the 2009 open-source revolution?
CNET's Dave Rosenberg suggests exactly that, and he's absolutely right.
Here's an illustrative anecdote. Two years ago, a Fortune 500 company approached Alfresco (my employer) but ultimately concluded that it was too conservative and open source too risky to move forward. A week ago, the same people contacted me to say that times have changed: given the potential cost savings, it's now considered too risky not to use open source.
We may be on the cusp of the next big wave of computing. This time it won't be "personal" computing. It will be social computing. It will be open source.