
Ghosts in the machine: A review of Nick Carr's The Big Switch

Nick Carr paints a frightening vision of the future, but is it true?

Matt Asay, Contributing Writer
Matt Asay is a veteran technology columnist who has written for CNET, ReadWrite, and other tech media. Asay has also held a variety of executive roles with leading mobile and big data software companies.

I figured I knew Nick Carr's central thesis behind his new book, The Big Switch: Our New Digital Destiny, before I started. I've read Nick's blog religiously for years and was fortunate to have him keynote last year's Open Source Business Conference.

The thesis runs something like this: IT no longer matters much, because interchangeable software systems, widely used throughout industries, no longer provide a basis for competitive differentiation (see pages 56-57). In the next phase (dubbed "utility computing"), traditional IT matters even less: data centers become the new utilities, deploying software applications more efficiently than any one company could hope to manage on its own. Jack into the network of data and services and get on with your business.

What I wasn't anticipating was where such nonchalance could lead socially. This comes in the second half of the book, and it left me wishing that Nick's arguments weren't so lucidly advanced. It would have been nice to caricature his argument and move on. Unfortunately, I'm not sure that's possible.

But first, Utopia, before further discussion of Carr's Dystopia.

Using the history of Burden's waterwheel, Edison's discoveries and inventions, and Insull's electric power utility as backdrops, Nick writes of the coming rise of information technology (IT) utilities:

[The] fragmentation [of IT into separately installed and managed systems that individual corporations own] is wasteful. It imposes large capital investments and heavy fixed costs on firms, and it leads to redundant expenditures and high levels of overcapacity, both in the technology itself and in the labor force operating it. The situation is ideal for the suppliers of the components of the technology--they reap the benefits of overinvestment--but it's not sustainable. Once it becomes possible to provide the technology centrally, large-scale utility suppliers arise to displace the private providers. It may take decades for companies to abandon their proprietary supply operations and all the investments they represent. But in the end the savings offered by utilities become too compelling to resist, even for the largest enterprises. The grid wins. (16)

A utopian vision of the future, right? Well, not really. As Nick writes, we're actually knee-deep in the future of utility computing already, as Napster, Google, and other web services demonstrate.

More interestingly, Nick suggests that these changes are not so much voluntary as economic and societal in nature. The waste inherent in single-purpose servers and personal computers stems from the tension between laws named for two Intel executives: Moore's Law, which describes the extremely fast rate of improvement in processing capacity, and Grove's Law, which holds that bandwidth between machines will improve at a much slower rate. (58-59)

Dramatically increase bandwidth, however, and suddenly the Internet can truly become the computer, as processing can happen efficiently "in the cloud" rather than being relegated to the client. As Eric Schmidt declared as CTO of Sun Microsystems:

When the network becomes as fast as the processor, the computer hollows out and spreads across the network. (60)

This has been happening at a rapid pace. Unfortunately for the Old Economy software providers, there is little need for them in this New Economy computing model, which has Microsoft and its ilk scared witless. (67-68) There's no money in selling individual bits into a collective cloud built from open-source software.

One problem with this vision, however, is that there may also not be any people. "The Internet is the wave of the future. Just don't try to get a job in it," Nick quotes New York Times writer Floyd Norris as saying (135). The web obviates the need for high operational headcount, as Google, Flickr, and other web services demonstrate. The greater efficiency the Internet provides - given that its applications are software, not people - may displace millions (upon millions) of workers. That's progress, you say? Perhaps. Nick writes:

Whereas industrialization in general and electrification in particular created many new office jobs even as they made factories more efficient, computerization is not creating a broad new class of jobs to take the place of those it destroys. (136)

The "computer" is being hollowed out, and that computer appears to be us.

Again the refrain: "Who cares? This is progress! Society will realign itself and new jobs will be created." Perhaps. But Nick points out that at least some of those jobs are being taken over by amateur "community members" working for reputation or for fun, which undermines the ability of newspapers, for example, to fund "hard journalism," since it won't generate the page views that pay-per-click advertising rewards (155). "We may find," he writes, "that the culture of abundance being produced by the World Wide Computer is really just a culture of mediocrity--many miles wide but only a fraction of an inch deep" (157). YouTube, anyone?

I've noted this phenomenon myself on this blog. The most page views go to lowest-common-denominator, sensationalist posts. These aren't the posts I really want to write and, given that I have a day job, I think I do a decent job of avoiding them. But it's frustrating to see people glom onto the superficiality of some posts when others convey real information about how open-source companies and communities can operate, and about the market effects thereof - which is, after all, the primary purpose of this blog.

Finally, Nick calls out the fantasy that somehow the web liberates us. It can have this tendency, but it also enables greater control by governments and corporations, a fact not helped by Google's apparent belief that we should become mere accomplices to the great god Google, an artificial intelligence "being" in which we will be fortunate to embed ourselves (212-213). I don't want to be Google'd. I, along with Bill Gates, prefer to keep my computer "over there" (215), with me firmly "over here."

Even this is a myth, however, with my brain constantly plugged into my BlackBerry, this Mac, etc. Perhaps I'm already assimilated.

Which is the fear that Nick's excellent book ultimately left me with. He paints an inexorable and complacent march into the cloud, a march dictated by all the things I hold dear as a laissez-faire capitalist: efficiency, customer service, etc. I want those things, but I really don't want their byproduct: centralized control.

Is there an alternative? I'd love to get others' perspectives after they've had a chance to read the book. Should I have put the book down, relieved to see the Internet optimizing business processes? Or should I be frightened by the control such optimization implies?