
CNET Newsmakers
June 23, 1997, Paul Saffo
Silicon Valley soothsayer
By Margie Wylie
Staff Writer, CNET NEWS.COM

It's hard to resist calling Paul Saffo a futurist when he plays the part so well. This afternoon, he's finishing up a speakerphone conversation with a pleasant-sounding woman, which is really a computer, when our camera crew arrives.

"Wildfire," he says.

"I'm here!" it answers.

"Where do you think I am?" he asks.

"You're in your office until 4:27 p.m. and then you're unavailable," it says.

Welcome to the Institute for the Future, the nearly 30-year-old nonprofit think tank and consulting firm where Saffo is a director. At the institute, Saffo is one of many directors whose job it is to forecast the impact of technology on the transportation, medicine, communications, and information industries, among others. But in Silicon Valley, he's one of the most credible of a raft of gurus who earn their living talking and writing about what comes next on the information technology roller-coaster ride.

Neither wildly optimistic, like techno-utopianist Nicholas Negroponte, nor darkly pessimistic, like neo-Luddite Kirkpatrick Sale, Saffo offers a reassuring outlook: a scary and bumpy ride to a future he foresees being better than the present.

His moderate stance, combined with a propensity to speak in perfect sound bites, has earned Saffo a gold-plated card in the Rolodexes of journalists. To be certain, he's coined his own fair share of gimmicky buzzwords, like "smartefacts," but unlike many of the self-proclaimed digerati, Saffo is more than a media mannequin.

Computer and communications companies contribute to the institute's multimillion-dollar research fund in order to partake of Saffo's findings. He's recently returned from an academic sabbatical at Stanford University, where his research into the history of technology diffusion convinced him that the world isn't changing any faster than it ever did; rather, more changes are happening simultaneously than ever before. In an industry and an age that considers itself unprecedented, Saffo looks through the lens of the past to try to understand the future. And he has looked through that same lens to make some pointed observations about the present as well, including comparing today's computer industry to Colombian drug cartels.

CNET NEWS.COM indulged in a wide-ranging conversation with the technology forecaster in his Menlo Park offices. We touched on Internet commerce, the fear of the future in the computer industry, and the perversity of change.

We're living in a world that changes so rapidly. How can you possibly keep up, much less forecast technological shifts?
Saffo: The secret to my business is that nothing changes. Change is so slow and it repeats itself in funny ways.

In fact, the rate at which specific technologies diffuse today is no different from what it was 100 years ago. It takes about 30 years for any new technology to fully diffuse into our lives as an ordinary fact of life. But we all feel an acceleration effect, and the reason is not that individual technologies are diffusing more quickly, but that more things are diffusing at the same time; it is the interaction among multiple technologies that creates the acceleration we feel. The more things are developing, the more elements there are, the faster change feels.

In Silicon Valley, most ideas take 20 years to become an overnight success. All the major devices in our lives--fax machines, copiers, color TV, cable TV, personal computers--follow a roughly similar curve: about ten years before diffusion begins, ten years or longer for diffusion to really spike upward, and then a ten-year maturity period. The Internet is a classic instance of how long things take to catch on. It started in the late '60s. In 1989 there's this uptick, and now we're in this boost phase.
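The three ten-year phases Saffo describes trace a classic logistic S-curve. As an illustrative sketch only (the 30-year span is from the interview; the midpoint and steepness values are assumptions chosen to fit his description):

```python
import math

def adoption(year, midpoint=15.0, steepness=0.35):
    """Logistic S-curve: fraction of eventual adopters reached
    a given number of years after a technology's introduction."""
    return 1.0 / (1.0 + math.exp(-steepness * (year - midpoint)))

# The three phases Saffo describes: a slow start, a sharp
# middle spike, then a leveling-off into maturity.
early  = adoption(10) - adoption(0)    # years 0-10: little diffusion
spike  = adoption(20) - adoption(10)   # years 10-20: steep growth
mature = adoption(30) - adoption(20)   # years 20-30: maturing
```

The curve is a toy, but it reproduces the shape he describes: the middle decade accounts for most of the diffusion, which is why a technology can simmer for 20 years and then look like an overnight success.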

You know, most people are like Mark Twain, who observed, "I'm all for progress; it's change I object to." We're all terribly afraid of change, even the people in Silicon Valley who claim to make their living from it. Their idea of change is to change everything else in order to preserve the things they cherish.

Microsoft is a company that is desperately resisting change. Its strategy is two-tiered. One tier is to hang on desperately to what it's got: keeping the operating system important even though we're moving into a world where the OS becomes steadily less important. At the same time, it is desperately looking for the next high-growth field it can make money on, so that when the OS finally does start to decline, it will have a new field. It has targeted two areas: one is media and the other is services. Everything it's doing is going into those. It is a classic case of a change-hating company; it is desperately trying to retard change.



Age: 42

Claim to fame: Information Age sage

Latest work: Historical study of how new technologies disperse through society

Other themes: Sensors; millennial fever; nanotechnology

Writing: Silicon Valley Dreams, a collection of essays published in Japan

Degrees: B.A., Harvard; LL.B., Cambridge University; J.D., Stanford Law School

The middleman myth and "disinter-remediation"

The Internet was supposed to cut out the middleman, but you've said the opposite is happening. Can you explain that?
Conventional wisdom is a terrible thing. Conventional wisdom is generally wrong about everything involving the Internet. Four years ago, conventional wisdom was that the Internet was the death of advertising. Well, that was absurd! The Internet turned out to be a full-employment act for everybody in the advertising business.

The latest piece of conventional wisdom that is wrongheaded is called "disintermediation." The essence of disintermediation is that somehow we're going to eliminate the middleman and go directly from buyer to seller, and everybody in the middle is going to be history. Real estate agents are gone, bankers are gone, etc. That is absolutely not what's happening. We are throwing out a lot of the middle players, but we're also putting in new middle players. The simple fact is that information technologies complexify the business environment. They create more options and opportunities and more niches for players. So what we're doing is creating a turbulent situation where we're throwing out some of the old players and putting in some of the new players, and at the end of the day we have more people in the middle, not fewer.

What's an example of that?
There's a historic example that demonstrates it. Once upon a time in the late 1950s, if you wanted an airline ticket, you went straight to the airline. You didn't go to a travel agent. You only went to travel agents for hotels and cruises and things. Along comes the mainframe computer and an executive at American Airlines has this really bright idea to build what would become the Sabre system. They needed someone to run this automated terminal system, so they gave it to the travel agents. Suddenly, you no longer went directly to the airlines; you went to the travel agent who was your intermediary to the airline. It was so effective and so good for American that the Justice Department said, "You have to let the other airlines on it." So Sabre, which was this unique advantage that American had, then became a commodity for all the airlines to use, but the travel agent was firmly intermediated in the mix.

So then another few years later, another executive at American Airlines said, "We need something that makes us unique. What can we do?" Minicomputers had arrived and they built the first frequent flyer program, AAdvantage, and once again mediated this new thing between the traveler and the airline for the airline's advantage. It didn't stop there. All the other airlines did the same thing and it became a commodity, still influencing purchase decisions. Then the credit card companies got in on the game using their computer systems, where suddenly you could buy a MasterCard or a Visa card that allowed you to get frequent flyer points.

Suddenly, what was a very simple relationship between traveler and airline was now a complex relationship, full of innumerable players. All of a sudden, people are doing things (thanks to computers) like arranging their flight schedules in ways calculated to get more miles, even though it was not the fastest route, and buying things on their credit cards for reasons that had nothing to do with travel, solely to get frequent flyer miles.

Information systems complexify. They create new niches, new opportunities for players. The more you put information technology in, the longer and more complicated the value chain becomes.

Yet I see the opposite. American Airlines sends me email every week telling me what their specials are if I buy directly from them.
Right. What's happened is that value chains have disappeared and in their place we now have value webs. The relationship between you and American Airlines, there are moments when it is direct (like if you're talking to the frequent flyer program), but there are other times where it's indirect and accelerated by your travel agent saying "Gee, American has a special. Do you want to use that?" They do promotions to the agent, the agent promotes to you, and it influences your buying decision.

To the question of how long a value chain can be, the answer is identical to something that Benoit Mandelbrot, the father of fractal geometry, wrote in 1967, in an article titled "How Long Is the Coast of Britain?" Well, the answer is a) it depends, and b) effectively infinite. You see, it depends because the coastline is one length if you're on a plane flying around the island; it's longer if you're on a boat going in and out of little coves; longer still if you're in a car driving along the bluffs; and longest of all if you're an ant going up and down each one of the pebbles on the way around. It's effectively infinite. Well, how long is a value chain? A) it depends, and b) effectively infinite. Sometimes it's very direct, but other times it's very long and circuitous. And the more the computer is in the mix, the longer and more circuitous it becomes.
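Mandelbrot's coastline result can be reproduced with a toy fractal. In this sketch the Koch curve stands in for the coastline: each refinement of the measuring ruler replaces every segment with four segments a third as long, so the measured length grows by a factor of 4/3 at every step and never converges.

```python
def koch_length(refinements, baseline=1.0):
    """Measured length of a Koch-curve 'coastline'.

    Each refinement step replaces every segment with 4 segments,
    each 1/3 as long, multiplying the total length by 4/3.
    """
    return baseline * (4.0 / 3.0) ** refinements

# The finer the ruler, the longer the coastline measures;
# there is no limit: "effectively infinite."
lengths = [koch_length(n) for n in range(6)]
```

The same logic carries over to value chains: the answer to "how long is it?" depends entirely on how finely you trace the intermediaries.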

People who think that there is disintermediation happening have got a very static view of things. We are throwing out middle players today, but we're going to put new middle players in. It's really a process of "disinter-remediation." The computer comes in, it changes the commercial environment, the middle players that do not adapt to that change are tossed away, and new players that can deal with the new environment come in and assume roles.

This isn't anything new. Information systems have been doing this for as long as anyone can tell. Exactly the same thing happened thanks to the printing press in the closing years of the 1400s. The printing press, through both direct and indirect impacts, led to the invention of modern mercantile capitalism between about 1490 and 1520. In fact, you can trace it to the publication of a little book called the Treviso Arithmetic in 1478. Published in a suburb of Venice, it's the earliest known dated book of shopkeeper's math, written in the vernacular, not Latin. It probably wasn't the very first, but it's the earliest one we know of. Think about what you need for capitalism: a class of numerate shopkeepers who can do the business math. This was a how-to book of business math. Out of this innocent little tome grew everything we take for granted today.

What sort of implications does this myth of disintermediation have for businesses?
There are a couple of implications for businesses about disinter-remediation. One is: do not blindly assume you should get closer to your customer. Sometimes effective communication means getting farther away in the right sort of way.

If you're selling stereo systems, you want to make sure that the customer has your product brought to his or her attention at Circuit City. You need to find the right kinds of partners. So the nature of this business is picking partners right. Then second is accepting the fact that partner relationships are going to be as volatile as the marketplace.

Once upon a time, it was enough to pick the right partner. Now, you have to pick the right partner in time and effect the partnership quickly enough so you can take advantage of the market, and also be prepared to break the partnership on friendly terms when it no longer makes sense. You don't want to be mean to partners, but everybody walks in saying, "OK, here's this moment in time. We have an opportunity; we'll go after it. The moment the opportunity disappears, we go our separate ways, and we may come back together again when conditions change."

The other element of this--and I think this is very good news for entrepreneurs--is that information technologies scale badly. When an information technology first arrives, it's easier for a small organization to use than a large one. Because information technologies scale badly, the advantage accrues to the small players. It's an uneven playing field, and the small player has the advantage over the big corporation. Big corporations have money and resources, but they can't use new technologies as effectively as small bands of entrepreneurs with crazy ideas, shoestring budgets, and not a lot of adult supervision.

Is there a Web example of that?
There's a classic Web example, and that is versus the established bookstores. [ founder] Jeff Bezos had a fabulous inspiration. He was not a book man; he was a computer guy on Wall Street who said, "This is a fabulous vehicle for selling things. What would people like to buy and sell?" He looked at the demographics and said, "This is a no-brainer: books." He looked at book distribution and said, "Really terrible distribution system. Crown Books is appalling. All you get is the mainstream stuff; you can't get all those other books. I've got publishers who want to sell small-run books to the public. Let's connect them and create a more efficient marketplace...and then do creative things."

The idea on is that if Nicholas Negroponte has a new book out, sells it, and on my Web page I can say, "Here are my five favorite books right now. One is Nicholas Negroponte's Being Digital," and at the bottom there's a button that says "buy it." You click that button, you buy Negroponte's book through, and I get six percent of what you pay. That's a virtuous cycle. It's a whole different way of buying and selling stuff. got in there ahead of everyone else because it had the vision and could also scale with the technology. Now the big boys are coming: Barnes and Noble comes in with a nice Web page, but it did the classic big-company stuff. The lawsuit it filed right around's public offering was a really dirty trick. To me, that crossed the line. That was bad business practice...and it's smarter than that. That was just mean. But it got back to business. Now it's doing a Web page and chasing Will it catch up? Who knows? There's space for more than one bookstore in cyberspace.
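The referral mechanic Saffo describes is simple arithmetic: the referring page owner earns a fixed cut of each sale. A minimal sketch; the six percent rate is from the interview, while the $25 book price is an assumed example:

```python
def referral_payout(price, commission_rate=0.06):
    """Commission owed to the referring page owner on one sale."""
    return round(price * commission_rate, 2)

payout = referral_payout(25.00)   # a $25 book earns the referrer $1.50
```

The "virtuous cycle" is that every recommendation page becomes a storefront at essentially zero cost to either party.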


Free market determinism and other follies

You've spoken and written about businesses going from the machine models to organic models. Is that what's going on with the Web?
Absolutely. It's gotten a big boost from information systems, but it's also an independent trend that you see in a lot of different areas: we are abandoning industrial- and machine-age metaphors for business in favor of biologically based metaphors. Instead of talking about organizations as machines, we're going to talk about organizations as organisms. I think ultimately we're going to borrow very heavily from evolutionary biology in general, and from the field of symbiosis in particular, as a means of understanding business as a biological construct rather than a mechanical one.

My guess is that 100 years from now, when the economic history of the 20th century is written, we will look back and say that John Maynard Keynes was not the most important economist of the century. It was Joseph Schumpeter, because Schumpeter really captured the importance of the role of technology and innovation in the economic cycle.

He missed a few things. He didn't understand the dynamism of the entrepreneurial process, but the phrase Schumpeter is famous for is "waves of creative destruction" sweeping through established industries: an industry would arise, become established, have its moment in the sun, and then something newer would come along, and you would have this creative destruction as the old industry was torn apart and a new one established. That is exactly what happens in technology markets. It's the process of the S-curve.

That sounds perilously close to free-market determinism.
No, it's not. That's the really dangerous thing about this kind of shift. There is already an appalling amount of nonsense being written about biology as a metaphor for business, and there's a real problem with the metaphor. If you go to experts in symbiosis and say, "Teach me your field. Help me understand the nature of relationships. Give me a vocabulary I can apply to business--commensalism, parasitism, symbiosis--give me the words," and then you really press the biologist, it turns out that even the biologists who are experts in symbiosis can't explain it. Biologists do not yet understand symbiosis, except that it exists. If you press the scientists, real quickly everything ends up sounding like Kipling's Just So Stories.

So what we have is one science--biology--being raided by disparate economic specialists to explain this other field, economics...and we're not even sure whether the emperor has any clothes in the biology arena. This is all made worse by the whole bunch of middle-aged white boys out there who feel compelled to write these moronic, prescriptive management books. They're starting in on this biology thing. You know, we've burned out reengineering; the next one is intellectual capital; the one after that is probably going to be more than one book on biology. Because it's such a fuzzy field, they're going to turn around, build a reasoned argument, say this is why biology matters, dot, dot, dot...and conclude that's why it should be Newt Gingrich-style capitalism, or nouvelle communism. They will end up taking one leap of faith too many, and because the field is new and intriguing and exciting, it will be very hard for people to judge critically whether that's a reasonable assumption or the person is completely off-the-rails crazy.

It seems to me that Silicon Valley is peddling the belief that "you can't stop the march of technology" more and more with a sort of blind faith in capitalism. Do you see that?
What's really happening right now, in terms of global markets and capitalism, is that yes, capitalism has become a religion, but it has become a religion at a point where the United States no longer has a clear enemy. We are a culture driven and defined by who we oppose, and if there is no enemy out there, we will find one or invent one. We desperately need the other in order to find our own personalities.

Capitalism has become a religion, but we're discovering it is a religion with several different sects. There's the entrepreneurial capitalism symbolized by Silicon Valley. There is a Confucian capitalism that emphasizes the community instead of the individual, typified by Japan but now being reinvented in places like China and Singapore. You go to Shanghai and you see a different variation of capitalism. This is very troubling, because the most vicious wars have been fought not by people of opposing belief systems, but by people who hold different variants of the same belief. The Christians were vastly more cruel to each other than they were to the Muslims, and vice versa. And while we feared communism, the real danger is different dialects of capitalism duking it out.

Now I think shooting wars are unlikely, but economic warfare in the next century, information warfare, could be very vicious indeed.

We know that a pure free market model often leads to suboptimal technology consequences. If it's a pure marketplace setting, we often end up with suboptimal standards. You get "lock-in" around one thing. We've got lock-in around Windows.

Ted Nelson put it very nicely when he paraphrased Lord Acton of a hundred years ago: "All power corrupts, and obsolete power corrupts obsoletely." DOS was basically a model of a time-sharing system, except there was no remote computer. It was a time-sharing system in a box. There were other models competing for our attention at the time, but that one, because of the vagaries of free market forces, got established. As Brian Arthur at the Santa Fe Institute likes to say, "Them that has gets." Microsoft got into that slot and, because of the business genius of Bill Gates, kept growing bigger and bigger, capturing more of the market and subsuming and killing off superior models that could have worked.

That was a classic case where free market forces led us down a random walk to an inferior technology that we're now stuck with. It's not just the command-line stuff. We're stuck with this WIMP interface (windows, icons, menus, pull-down stuff), and it is a terrible model for the world we're going into. It was fine when we were just processing information, but now our computers have become windows on a larger information world, and increasingly it is a world where machines talk to machines on behalf of people. For that, a windows interface is a terrible interface, and we desperately need something new. But as long as Microsoft can control the market and charge monopoly rents, that won't happen.

You can hear the first whisper: The rising gales of creative destruction are coming. The longer people resist the change, the greater the change and discontinuity will be when it finally comes. If Gates is still in charge of Microsoft when it finally hits, you can bet that Microsoft will be part of the change and will survive to the next phase.

What is the next phase?
We have a digital fixation at the moment. We're completely obsessed with digital technologies and act as though there isn't a problem on the planet they can't solve.

In the next five years, it will become very clear that the single largest growth area of electronics is going to be hybrid analog-digital electronics. In the long run, 50 to 75 years from now, we may look back and recognize that digital technology was just a brief interval between two analog orders.

About every ten years, a new foundational technology arrives that pretty much sets the stage for all the innovation to follow. Around 1980, that technology was the microprocessor, and it ushered in a decade-long processing revolution in which we were completely and utterly preoccupied with processing everything we could get our hands on. The symbol of that decade was the PC.

The '90s are shaped by a different technology: the advent of cheap lasers. It was laser diodes that made possible all the bandwidth down fiber-optic phone lines and all the storage of CD-ROMs. In contrast to the '80s (the processing decade), the '90s are an access decade, where the devices that matter are defined not by what they process but by what they connect us to.

We're right in the middle of that, but now we can see the technology that's going to shape the next decade, and that is cheap sensors: the eyes, ears, and other sensory organs we're going to hang off our computers and our networks. Basically, we're going to give our computers and networks the ability to become aware of the analog world that surrounds them, a world they have no concept of today. That is what will set the stage for the resurgence of analog technology.

One example is MEMS (microelectromechanical systems), a semiconductor technology used to create analog sensor devices. A sensor is by definition an analog device: you collect analog information out of the physical world, at some point do an analog-to-digital conversion, and then process it with a digital engine. But we know there are some classes of problems that are better solved in analog space than in digital space. Vision recognition, for example: a lot of that is better solved in analog space.
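The pipeline Saffo outlines (capture analog information, convert it to digital at some point, then process it with a digital engine) can be sketched with a toy quantizer. The 8-bit resolution and the sine-wave "signal" here are illustrative assumptions, not details from the interview:

```python
import math

def quantize(sample, bits=8, vmin=-1.0, vmax=1.0):
    """One analog sample -> integer code (the A/D conversion step)."""
    levels = 2 ** bits - 1
    clamped = min(max(sample, vmin), vmax)
    return round((clamped - vmin) / (vmax - vmin) * levels)

# Analog world: a continuous signal, observed at discrete instants...
analog = [math.sin(2 * math.pi * t / 64) for t in range(64)]
# ...digital world: the integer codes the digital engine actually sees.
digital = [quantize(s) for s in analog]
```

Everything upstream of `quantize` is the analog space Saffo is pointing at; his claim is that some problems, like vision recognition, are better attacked there, before the conversion throws structure away.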

People marvel today at how central computers are in our lives. They are nothing compared with what they will be 10 or 20 years from now. Saying computers are central to our lives today is like saying in 1965 that computers were central to our lives because they processed our Social Security payments and our payrolls. Computers are going to become vastly more important than they are today, in ways we won't notice. And if you want to measure the importance of a machine today, its importance is an inverse function of its visibility. If you can see the device, it is not important at all.

I could come into your office, take a sledgehammer, and smash your desktop machine. You might whimper a bit, but you'd find a pad of paper and get back to work. You'd be back to normal in five minutes. But it is the machines you wouldn't even guess existed whose disappearance would utterly ruin your life: the computers that run the power grid, the ESS7 switch down at the local phone company. Shut them down and you are hamstrung.

The little machine on our desks is the tip of a large digital iceberg, and the least important part. It is machines we never even imagined existed that we are utterly dependent upon, and it's that whole complex that is growing vastly more rapidly than computers on the desktop. In ten years, we'll look back and say, "I can't believe people in the '90s thought computers were central to their lives, compared to today."

So will Moore's Law hold up?
Moore's Law still applies, but you can start seeing the end of Moore's Law. The moment you start doing analog technologies, Moore's Law starts getting really unpredictable.
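Moore's Law in its usual form (transistor counts doubling on a fixed cadence) is plain compounding, which is what makes its end easy to spot when the cadence slips. A sketch; the two-year doubling period is the conventional figure, used here as an assumption:

```python
def transistors(years, start=1.0, doubling_period_years=2.0):
    """Relative transistor count after compounded doublings."""
    return start * 2.0 ** (years / doubling_period_years)

decade_growth = transistors(10)   # 32x over ten years at a 2-year doubling
```

Analog devices don't ride this curve, which is one reason a shift toward hybrid analog-digital electronics makes the law's trajectory unpredictable.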

Even before we hit the wall with Moore's Law, we're going to discover that there's a demand for different kinds of architectures. Von Neumann architectures are looking real shaky right now; they're on the skids. The von Neumann architecture, the classic I/O architecture we live with, is going to be challenged by new architectures, and a big part of the cause is cheap sensors. Let me give you an advanced example, something that probably won't happen for 15 or 20 years.

Scientists at UCLA are working on a smart skin for airplanes. Basically, the leading edge of the wing is covered with very small sensors and small tablike ailerons, each about 1/3,000th of an inch across. They exploit, by the way, a quality of silicon we all forget: silicon has a tensile strength greater than steel, a strength-to-weight ratio better than aluminum, and a thermal expansion coefficient near zero.

So on the leading edge of a wing, you use this to damp down microturbulence. The leading edge has tens of thousands of these things along it, each a sensor and an actuator that pops up and down. Now you say, "OK, we need a computer to run it." Well, you run fiber-optic cable to a point behind the pilot's seat and you take the fastest computer in the world. In fact, forget the fastest computer in the world: say I magically deliver a computer with zero wait states, no latency, infinitely fast. It turns out that even with an infinitely fast computer, the speed of light gets in the way, because just the time it takes to send the signal from the sensor down to that computer and back to the actuator is too long.
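The speed-of-light objection is pure arithmetic: even an infinitely fast central computer cannot beat the signal's travel time. This sketch assumes a 15-meter run from a wing sensor to a computer behind the cockpit and propagation at roughly two-thirds of c in optical fiber; both numbers are illustrative assumptions, not from the interview:

```python
C = 299_792_458.0   # speed of light in vacuum, m/s

def round_trip_latency(distance_m, velocity_factor=0.66):
    """Minimum sensor-to-computer-to-actuator delay in seconds.

    velocity_factor: fraction of c at which the signal propagates
    (roughly 2/3 in optical fiber).
    """
    return 2.0 * distance_m / (velocity_factor * C)

# A hard floor of roughly 0.15 microseconds per control loop, per
# device, before the central computer does any work at all.
floor = round_trip_latency(15.0)
```

Multiply that floor across tens of thousands of sensor-actuator pairs sharing one central machine, and the case for putting a small processor next to each pair makes itself.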

So what you need is a radically distributed architecture where each sensor-actuator pair becomes a triad: a sensor, an actuator, and a little processor. All those devices are going to have to talk to each other in some radically distributed process.


The pace and perversity of change

People do feel overwhelmed by the rate of change in technology. What lessons can we learn from the past to help us into the future?
Well, one thing is an old folk saying I learned as a small child from a cowboy: "Never mistake a clear view for a short distance." It's really true. We're fascinated by change even though we fear it, and we're fascinated by innovation. That fascination leads us to a very peculiar form of double vision.

We tend to overestimate the short-term impacts of some threatened change, but we underestimate the long-term implications. Either our hopes or our fears lead us to overestimate the short term, and then when reality fails to conform to our inflated expectations our disappointment leads us to underestimate the long term.

We did it with PCs. Back in 1980, everybody was saying "PCs are going to change our lives; everybody's going to have them; we're going to live in electronic cottages; we're going to move out to the mountains." It didn't happen, didn't happen, and didn't happen. Even by 1985, people were astonished that not every home had a PC. At that point, they said "It will never happen." And they walked away.

Then one company said, "Gee, what if we created a specialized personal computer that just did one thing everybody liked: entertainment?" That company was called Nintendo, and the rest, as they say, is history. Now the impacts--10, 20 years later--the impacts are actually much greater than anyone imagined.

We take the World Wide Web for granted today, but if you had told someone in 1980 that researchers and teenagers (14-year-olds in 1997) would be dialing into computers in Singapore, sucking data off them, and surfing this Web, they would have thought you were a nutcase. "No way! That couldn't possibly happen!" Even as they were saying that every home in the United States would have a personal computer by 1985.

So one lesson is never to mistake a clear view for a short distance, and to understand that even the most expected of futures tends to arrive late and in utterly unexpected ways.

If the future is so hard to predict, always comes late, and in ways you can't foretell, how do you keep your job?
I am not a futurist; I'm a forecaster. As a forecaster, what I really do is nothing more than applied common sense. We all forecast every day of our lives. Get up in the morning, look out the window, and you're making an estimate of what the weather will be like. Buy a house and choose between a fixed mortgage and an adjustable mortgage: you have just made a synthesized forecast of what the U.S. economy is going to do over the next 20 years. We all do it. The only difference is I happen to do it as a full-time job.

My motto is "strong opinions, weakly held." Information is always incomplete. So what I try to do is come to a conclusion based on the information I have and then systematically try to tear down my conclusions, looking for evidence that shows I'm wrong. Most people do the opposite: They spend forever building their conclusion, and then they hold it and cherish it in the face of conflicting evidence.

Most people look for things that fit into categories. I look for things that don't fit. My poor wife has to suffer through this. We were driving up the coast on Highway 101 and there was a sign that said "End emergency callboxes." It just bugged me and it bugged me enough that I turned around, went back, stopped in front of the sign, and shot a photograph. I didn't know why it was important. I kept noodling over it.

Then by the end of the weekend I knew why it had bugged me. If you're driving along, you assume that you're in a zone of no communications unless someone puts up a sign and says you can communicate here: phone. Somehow, we had suddenly entered a world where Caltrans (the California transportation department) felt compelled to put up a sign saying in effect, "Warning: You're leaving the zone of communications. You're welcome to drive further north, but don't blame us if you can't find a phone." The world had just flipped.

Why does the computer industry not learn from the past?
It's the same reason they call their customers "users." The computer industry drips with scorn for everything except itself. As has been said, there are only two industries on this planet that would use such a sneering term as "user" for their customers: the computer industry and the drug cartels. Both have an equal lack of respect for the poor, miserable souls who have to use their crummy products. It is the nature of engineers to think they invented everything, and to have no sense of history.

Are you optimistic about the future?
I'm a professional bystander and by nature a professional agnostic. But the history of technology gives one a sound basis to be a short-term pessimist and a long-term optimist.

If you look at the last couple of centuries, either things have been great but everybody said, "It's about to go down the tubes," or things have been bad and they said, "It's about to get worse." But if you look at the sweep of history, over the long run things have actually gotten pretty steadily better. I, for one, would not have liked to live before the invention of modern anesthetic dentistry.

When we look at the future, we tend to amplify it. If you're an optimist, you get wildly optimistic and imagine a new nirvana. If you're even mildly pessimistic, you amplify in that direction and imagine an unimaginable hell of our own creation. Well, neither will happen unless we really screw up. What we will do is what we've always done: muddle through somewhere in the middle. Technology will be a mixed bag of pleasant surprises and unpleasant consequences, but it will mush on through, and we'll do what we always did. So, in short, things have been going to hell for as long as anyone can remember, but in the long term they've actually gotten better. I think there is sound reason to believe the same will happen now: we'll have plenty of short-term crises and surprises and bumps, but things will get better.