At the forefront of this movement is Professor Nicholas Negroponte, founder and former director of the MIT Media Lab. His not-for-profit One Laptop Per Child (OLPC) project has been developing a laptop (targeted at $100 but currently struggling to break $200) suitable for use by every child in the developing world. Recently, Intel joined the OLPC board and will even contribute funding to the project.
Helping people in the developing world cross the digital divide is a fundamental act of decency, generosity--and even self-interest--as these new markets grow, consumers spend and productivity surges. The need for technology among the underserved is so urgent, the hopeful thinking goes, that even a computer with no commercial viability--no distribution channels, no maintenance, no training, no programming services, in fact virtually no IT ecosystem at all--can meet that market's need.
As laudable as this dream is, the ideal unfortunately runs counter to a fundamental fact of life: a computer cannot exist independent of basic economic realities.
A computer is, rather, a creature of connectivity and collaboration. And, given the economic realities in the developing world, $200 computers cannot generate the profit essential for creating the robust IT ecosystem that successful deployment, ongoing operation and maintenance require.
The price of a base-level personal computer today is about $400. That hasn't changed much in the last ten years, although the power such a computer delivers has increased profoundly. As a result, however, the world's computer user base has been stuck at a largely saturated 850 million users for years. Unfortunately, another billion potential users--most in developing and underserved markets like education--cannot afford the requisite $400. If we can merely squeeze down the price tag, have we solved their problem?
Only if you believe that OLPC and Intel's $200 laptops, with their PDA-like, seven-inch screens and obsolete processors, are the answer. But the developing world is not just "village kids"; it is motivated, ambitious people engaged in business, agriculture, commerce, health care, finance and education.
For PCs to be productive in this business and educational landscape, they require a host of supporting services, plus reasonable features and capabilities. A PC must communicate, which mandates connectivity. That, in turn, demands configuration, maintenance, professional services, technical support, hardware and software upgradeability. Without a healthy ecosystem, a PC is not worth even $200.
Here in the developed world, PC hardware makers have endured profitless computing for years, the result of operating in a saturated, upgrade-driven market. We know how sick that business is, and we have already driven the cost of "real PCs" down as far as it can go.
However, not everyone needs their own PC. What they do need is access to the functionality and benefits that the PC provides, delivered in an affordable and efficient way. That's where I believe multi-user computing fills the void.
This multi-user model is not new. During the 1960s, when computers were all mainframes and cost millions, multi-user computing, in the form of time sharing (where we rented access by the hour using low-cost "dumb terminals") was our first tool for expanding the market from the "Fortunate 500" to the rest of us. This model continued through the 1970s with $100,000 and ultimately $10,000 minicomputers further expanding the market. In the 1980s came the PC and the world changed; ultimately, we all got our own computers.
Although the last ten years have seen very little movement in the price of low-end PCs, technology advances have turned the 2007 entry-level PC into a very muscular piece of technology whose gigapower is more than 1,000 times that of a $400 box built in 1998. Only a fraction of today's PC users, such as computational scientists, extreme gamers, graphic artists and industrial designers, use more than a few percent of what these mainframes on a desk can offer.
As a result, the vast majority of those CPU cycles are wasted, burning energy (150 to 200 watts per box) that is costly and scarce in these markets, and making the machines ever more expensive to own. So why not harness and share this extra capacity--resurrecting proven techniques and technologies from the past to take today's "mainframe on a desk" and put its power to work?
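The waste is easy to quantify. A minimal back-of-the-envelope sketch, using the 150-to-200-watt draw and "few percent" utilization figures from the text above; the electricity price and hours of use are hypothetical assumptions added purely for illustration:

```python
# How much of a desktop's energy spend goes to unused CPU cycles.
# Wattage and utilization come from the article; price per kWh and
# hours of use are assumed, illustrative values.

WATTS_PER_BOX = 175       # midpoint of the 150-200 W range cited
UTILIZATION = 0.05        # "a few percent" of capacity actually used
PRICE_PER_KWH = 0.15      # assumed electricity price in USD (hypothetical)
HOURS_PER_YEAR = 8 * 250  # one shift, ~250 working days (assumed)

kwh_per_year = WATTS_PER_BOX / 1000 * HOURS_PER_YEAR
annual_cost = kwh_per_year * PRICE_PER_KWH
wasted_cost = annual_cost * (1 - UTILIZATION)

print(f"Energy per box: {kwh_per_year:.0f} kWh/yr (~${annual_cost:.0f}/yr)")
print(f"Spent powering unused cycles: ~${wasted_cost:.0f}/yr per box")
```

Even under these modest assumptions, most of each box's energy bill pays for cycles nobody uses.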
Enterprise computer users have been benefiting from the PC version of multi-user computing since 1990, which our industry has dubbed "server-based computing." Blade computing and virtualization are the latest twists on this same multi-user concept.
However, these enterprise software and hardware components are expensive. The software licenses alone often add up to more than the cost of the full or stripped-down PCs being used as the access terminals. These terminals (thin clients) are themselves as expensive as low-end PCs. It has been, thus far, a technology for the rich and fortunate.
A number of new firms, including my own company, NComputing, have reincarnated the thin client with non-CPU-based access terminals. Access terminals are being built today at costs as low as $11 and sold for well under $100 per user. At the same time, they provide manufacturers, distributors, resellers and maintenance partners with full commercial margins. The expensive software and high-end servers have been replaced by low-cost or free software and desktop PCs. These multi-user environments tap the power of low-end PCs to support 10 or more concurrent users with power consumption of under 6 watts per user.
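The per-seat economics can be sketched from the figures already given: a $400 entry-level PC drawing 150 to 200 watts per user, versus one shared PC plus sub-$100 access terminals supporting 10 concurrent users at under 6 watts each. Treating the $100 terminal price as a ceiling gives an upper bound on the shared model's cost per seat:

```python
# Per-seat comparison of dedicated PCs vs. a shared PC with access
# terminals, using the article's own figures as inputs.

PC_COST, PC_WATTS = 400, 175   # one PC per user; 175 W = midpoint of cited range
TERMINAL_COST = 100            # terminals sell for "well under $100" (ceiling)
USERS = 10                     # concurrent users per shared PC
TERMINAL_WATTS = 6             # per-user draw cited in the text

shared_cost_per_seat = (PC_COST + TERMINAL_COST * USERS) / USERS

print(f"Dedicated PC: ${PC_COST}/seat at {PC_WATTS} W per user")
print(f"Shared model: at most ${shared_cost_per_seat:.0f}/seat "
      f"at under {TERMINAL_WATTS} W per user")
```

Even at the $100 ceiling, the shared model comes in at roughly a third of the hardware cost per seat, and at a small fraction of the power draw.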
All this evidence undercuts our industry's widespread assumption about how best to liberate emerging regions of the globe from the energy-wasteful business model being foisted upon them today.