Free market determinism and other follies

You've spoken and written about businesses going from machine models to
organic models. Is that what's going on with the Web?
Absolutely. It's gotten a big boost because of information systems, but
it's also an independent trend that you see in a lot of different areas.
That is, we are abandoning industrial- and machine-age metaphors for
business in favor of biologically based metaphors. Instead of talking about
organizations as machines, we're going to talk about organizations as
organisms. I think ultimately we're going to borrow very heavily from
evolutionary biology in general, and from the field of symbiosis in particular,
as a means of understanding the nature of business as a biological
construct rather than a mechanical one.
My guess is that 100 years from now, when the economic history of the 20th
century is written, we will look back and say that John Maynard Keynes was
not the most important economist of the century. It was Joseph
Schumpeter, because Schumpeter really captured the importance of the role of
technology and innovation in driving the economic cycle.
He missed a few things. He didn't understand the dynamism of the
entrepreneurial process, but the phrase Schumpeter is famous for is
"creative destruction." He imagined waves of creative destruction sweeping
through established industries: an industry would arise, become established,
have its moment in the sun, and then something newer would come along, and
you would have this creative destruction as you tore apart the old industry
and established a new one. That is exactly what happens in technology
markets. It's the process of the S-curve.
That sounds perilously close to free-market determinism.
No, it's not. That's the really dangerous thing about this kind of shift.
There already is an appalling amount of nonsense being written about
biology as a metaphor for business, and there's a real problem with biology
as a metaphor for business. If you go to experts in symbiosis and you say,
"Teach me your field. Help me understand the nature of relationships. Give
me a vocabulary I can apply to business: commensalism, parasitism,
symbiosis--give me the words," and then you really press the biologist, it
turns out that even the biologists who are experts in symbiosis can't
explain it. Biologists do not yet understand symbiosis, except that it
exists. Now if you press the scientists, real quickly everything ends up
being like Kipling's Just So Stories.
So what we have is one science--biology--being raided by disparate economic
specialists to explain this other field, economics...and we're not even
sure if the emperor has any clothes in the biology arena. This is all made
worse by the whole bunch of middle-aged white boys out there
who feel compelled to write these moronic, prescriptive management books.
They're starting on this biology thing. You know, we've burned out
reengineering; the next one is intellectual capital; the one after that is
probably going to be biology--more than one book of it. Because it's such a
fuzzy field, they're going to turn around, build a reasoned argument, say
this is why biology matters, dot, dot, dot...and that's why it should be
Newt Gingrich-style capitalism or it should be nouvelle communism. They
will end up taking one leap of faith too many, and because the field is new
and intriguing and exciting, it will be very hard for people to critically
judge whether that's a reasonable assumption or the person is completely
off-the-rails crazy.
It seems to me that Silicon Valley is increasingly peddling the belief that
"you can't stop the march of technology" with a sort of blind faith in
capitalism. Do you see that?
What's really happening right now, in terms of the global markets and
capitalism, is that yes, capitalism has become a religion, but it has
become a religion at a point where the United States no longer has
a clear enemy. We are a culture driven and defined by who we oppose, and if
there is no enemy out there we will find one or invent one. We desperately
need the other in order to find our own personalities.
Capitalism has become a religion, but we're discovering it is a religion
with several different sects. There's the entrepreneurial capitalism
symbolized by Silicon Valley. There is a Confucian capitalism that instead
of emphasizing the individual emphasizes the community, typified by Japan
but now being reinvented in places like China and Singapore. You go to
Shanghai and you see a different variation of capitalism. This is very
troubling because the most vicious wars have been fought not by people of
opposing belief systems, but rather by people who have different variants
of the same belief. The Christians were vastly more cruel to each other
than they were to Muslims, or vice versa. And while we used to fear communism,
the real danger now is different dialects of capitalism duking it out.
Now I think shooting wars are unlikely, but economic warfare in the next
century, information warfare, could be very vicious indeed.
We know that a pure free-market model often leads to suboptimal technology
outcomes. In a pure marketplace setting, we often end up with
suboptimal standards. You get "lock-in" around one thing. We've got lock-in
around Windows.
Ted Nelson put it very nicely when he paraphrased Lord Acton's dictum of a
hundred years ago and remarked, "You know, all power corrupts and obsolete
power corrupts obsoletely." DOS was basically a model of a time-sharing
system, except there was no remote computer. It was a time-sharing system in a
box. There were other models competing for our attention at the time. THAT
ONE, because of the vagaries of free market forces, got established. As
Brian Arthur at the Santa Fe Institute likes to say, "Them that has gets."
Microsoft got into that slot and, because of the business genius of Bill
Gates, kept growing bigger and bigger, capturing more of the market, and
subsuming and killing off superior models that could have worked.
That was a classic case where free market forces led us down a random walk
to an inferior technology that we're now stuck with. It's not just the
command-line stuff. We're stuck with this WIMP interface: windows, icons,
menus, pull-down stuff--it is a terrible model for the world we're going
into. It was fine when we were just processing information and the like,
but now that our computers have become windows on a larger information
world, and increasingly it is a world where machines are talking to
machines on behalf of people, a Windows interface is a terrible interface
and we desperately need something new, but as long as Microsoft can control
the market and charge monopoly rents, that won't happen.
You can hear the first whisper: The rising gales of creative destruction
are coming. The longer people resist the change, the greater the change and
discontinuity will be when it finally comes. If Gates is still in charge of
Microsoft when it finally hits, you can bet that Microsoft will be part of
the change and will survive to the next phase.
What is the next phase?
We have a digital fixation at the moment. We're completely obsessed with
digital technologies, convinced there isn't a problem on the planet that
they can't solve.
In the next five years, it will become very clear that the single largest
growth area of electronics is going to be hybrid analog-digital
electronics. In the long run, 50 to 75 years from now, we may look back and
recognize that digital technology was just a brief interval between two
analog orders.
About every ten years, a new foundational technology arrives that pretty
much sets the stage for all the innovation that follows. Around 1980, that
technology was the microprocessor, and it ushered in a decade-long
processing revolution where we were completely and utterly preoccupied with
processing everything we could get our hands on. The symbol of that decade was
the PC.
The '90s are shaped by a different technology: the advent of cheap lasers.
It was laser diodes that made possible all the bandwidth down fiber-optic
phone lines and all the storage of CD-ROM. In contrast to the '80s (the
processing decade), the '90s is an access decade where the devices that
matter are defined not by what they process but by what they connect us to.
We're right in the middle of that, but now we can see the technology
that's going to shape the next decade, and that is cheap sensors: eyes,
ears, and sensory organs that we're going to hang off our computers and our
networks. Basically, what we're going to do is give our computers and
networks the ability to become aware of the analog world that surrounds
them, which today they have no concept of. That is what is going to set the
stage for the resurgence of analog technology.
Sensors are, for example, MEMS (micro-electromechanical systems):
semiconductor technology used to create analog sensor devices. A sensor is
by definition an analog device, and what you do is collect analog
information out of the physical world, at some point do an
analog-to-digital conversion, and then process it with a digital engine.
But we know there are some classes of problems that are better solved in
analog space than digital space. Vision recognition stuff--a lot of that is
better solved in analog space.
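[To make that pipeline concrete, here is a minimal sketch of the sense-digitize-process loop he describes. The signal, sample rate, resolution, and smoothing filter are illustrative assumptions, not details of any real MEMS device.]

```python
# Minimal sketch: analog world -> A/D conversion -> digital processing.
# All parameters here are illustrative assumptions.
import math

def analog_signal(t):
    """Stand-in for the continuous physical quantity a MEMS sensor picks up."""
    return math.sin(2 * math.pi * 50 * t) + 0.2 * math.sin(2 * math.pi * 800 * t)

def adc(value, bits=12, full_scale=2.0):
    """Analog-to-digital conversion: quantize a voltage into an integer code."""
    levels = 2 ** bits
    code = round((value + full_scale / 2) / full_scale * (levels - 1))
    return max(0, min(levels - 1, code))

# Sample the analog world at 10 kHz, then process digitally with a
# simple moving-average filter (the "digital engine" step).
sample_rate = 10_000
samples = [adc(analog_signal(n / sample_rate)) for n in range(1000)]
window = 16
smoothed = [sum(samples[i:i + window]) / window for i in range(len(samples) - window)]
print(samples[:4], [round(x) for x in smoothed[:4]])
```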
People marvel today at how central computers are in our lives. They are not,
compared with what they will be 10 or 20 years from now. Saying computers are
central to our lives today is like saying in 1965 that computers are
central to our lives because they processed our Social Security payments
and our payrolls. Computers are going to become vastly more
important than they are today, in ways we won't notice. Also, if you want to
measure the importance of a computer today: the importance of a machine
is an inverse function of its visibility. If you can see the device,
it is not important at all.
I could come into your office, take a sledgehammer, and smash your desktop
machine. You might whimper a bit, but you'd find a pad of paper and get
back to work. You'd be back to normal in five minutes. But it is the
machines that you wouldn't even guess existed whose disappearance would
utterly ruin your life. The computers that run the power grid, the 5ESS
switch down at the local phone company: shut them down and you are
hamstrung.
The little machine on our desk is the tip of a large digital iceberg and
the least important part. It is machines that we never even imagined
existed that we are utterly dependent upon. It's that whole complex that is
growing VASTLY more rapidly than computers at the desktop. In ten years,
we'll look back and say, "I can't believe people in the '90s thought
computers were central to their lives compared to today."
So will Moore's Law hold up?
Moore's Law still applies, but you can start seeing the end of
Moore's Law. The moment you start doing analog technologies, Moore's Law
starts getting really unpredictable.
Even before we hit the wall with Moore's Law, we're going to discover that
there's a demand for different kinds of architectures. Von Neumann
architectures are looking real shaky right now. They're on the skids. The
von Neumann architecture, the classic I/O architecture we live with, is going
to get a challenge from new architectures. A big part of the cause is cheap
sensors. Let me give you an advanced example. It's just something that
probably won't happen for 15 or 20 years.
Scientists at UCLA are working on a smart skin for airplanes. Basically,
the leading edge of the wing has got arrays of very small sensors and small
tablike ailerons. These are 1/3,000th of an inch across. They exploit, by
the way, the qualities of silicon we all forget. Silicon has a tensile
strength greater than steel, a strength-to-weight ratio better than
aluminum, and a thermal coefficient near zero.
So on the leading edge of a wing, you use these to damp down microturbulence.
The leading edge has got tens of thousands of these things along it, a
sensor and an effector that pops up and down. Now you say "OK, we need a
computer to run it." Well, you run fiber-optic cable to a point behind the
pilot's seat and you take the fastest computer in the world. In fact,
forget the fastest computer in the world: I just magically deliver a
computer with zero wait states, no latency. It is infinitely fast. It turns
out that even with an infinitely fast computer, the speed of light gets in
the way, because just the amount of time it takes to send the signal from
the sensor down to that computer and back to the effector is too slow.
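[A rough back-of-the-envelope calculation makes the point. The fiber length, airspeed, and the use of the actuator size as the flow timescale are assumptions for illustration, not figures from the UCLA work.]

```python
# Back-of-the-envelope: even an infinitely fast central computer cannot beat
# the signal round trip from a wing-edge sensor to a box behind the pilot's
# seat. All numbers below are illustrative assumptions.
FIBER_RUN_M = 20.0               # assumed one-way fiber length, sensor to computer
SIGNAL_SPEED_M_S = 2.0e8         # light in optical fiber, roughly two-thirds of c
ACTUATOR_SIZE_M = 0.0254 / 3000  # the 1/3,000th-of-an-inch tab from the text
AIRSPEED_M_S = 250.0             # assumed cruise airspeed over the wing

round_trip_s = 2 * FIBER_RUN_M / SIGNAL_SPEED_M_S
# Time for the airflow to sweep past one tiny actuator -- a crude stand-in
# for how fast the control loop must respond to microturbulence.
flow_timescale_s = ACTUATOR_SIZE_M / AIRSPEED_M_S

print(f"signal round trip: {round_trip_s * 1e9:.0f} ns")
print(f"flow timescale:    {flow_timescale_s * 1e9:.0f} ns")
print(f"the round trip alone is ~{round_trip_s / flow_timescale_s:.0f}x too slow")
```

[With these assumed numbers the round trip is about 200 nanoseconds against a flow timescale of roughly 34 nanoseconds, which is why the processing has to move out to the sensors themselves.]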
So what you need is a radically distributed architecture where for each
sensor and effector there's a triad: a sensor, an effector, and a
little processor. All those devices are going to have to talk to each other
in some radically distributed process.