Dave Winer shares his recollections about developing software for the original Macintosh computer and offers his perspective on what Apple could have done better.
This article is part of a CNET special report on the 30th anniversary of the Macintosh, looking at the beginnings of Apple's landmark machine and its impact over the last three decades.
January 24, 1984. It was a wonderful day, the culmination of a couple of years' work by my startup company, Living Videotext. Our big public announcement would come on the same day the Mac rolled out.
I had hired my brother and future sister-in-law to help with the development. We had a Macintosh in our office. It was a huge secret at the time, but we showed it to our best friends anyway, swearing them to secrecy (now it can be told). I remember my first look at a Macintosh. I knew I was going in to see it, but nothing could have prepared me for the surprise, the feeling you get when you look at a baby or a puppy or kitten -- this thing is cute and gorgeous and new and filled with potential. Most important, it spoke directly to me and said, "I am the perfect place to put your software."
Up until then, I had been working on Unix systems, and then the Apple II. I loved the little computers, because they were all mine to use as I wished. On the earlier, bigger machines, you had to share them with other users and programmers. The machine felt far away, but with the Apple II, it was right there.
The Mac had that feeling too, but it was also elegant and simple and brilliant. The type of people who would love this machine would also love what I was making, an idea processor.
ThinkTank, which I had been developing until then on the Apple II, was a tool for organizing your thoughts on a computer screen. You could create an outline, then indent, move an item up a list, or out a level. Flesh out the details, and quickly record a top-level idea you had overlooked.
Our tool was designed to add what would eventually be called "agility" to your thought process. The ability to quickly revise as you learned more about the problem. Outlines on paper were rigid, but outlines on a computer screen -- they could fly!
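Those outline moves can be sketched in a few lines of code. This is a hypothetical illustration of the operations described above, not ThinkTank's actual implementation: an outline is simply a tree of items, and "move up" and "indent" are small rearrangements of that tree.

```python
# Hypothetical sketch of outliner operations -- not ThinkTank's real code.
# An outline is a tree: each item has text and a list of child items.

class Item:
    def __init__(self, text, children=None):
        self.text = text
        self.children = children or []

def move_up(siblings, i):
    """Move item i one position up within its sibling list."""
    if i > 0:
        siblings[i - 1], siblings[i] = siblings[i], siblings[i - 1]

def indent(siblings, i):
    """Demote item i: make it the last child of the item above it."""
    if i > 0:
        siblings[i - 1].children.append(siblings.pop(i))

# Reorganize a tiny outline on the fly, the "agility" described above.
outline = [Item("Intro"), Item("Details"), Item("Conclusion")]
move_up(outline, 1)   # "Details" jumps above "Intro"
indent(outline, 1)    # "Intro" becomes a sub-item of "Details"
```

The point of the sketch is how cheap each revision is: on paper, promoting or reordering an idea means rewriting the page; in a tree structure, it is one small mutation.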
The problem was, you weren't inside the outline on an Apple II or an IBM PC, you were above it. You couldn't move an item with your hands; you had to trick a "cursor" into doing what you wanted. The Mac's big new feature was the mouse. You could directly manipulate your ideas that way.
It's no coincidence that the earliest experimenter in the area we were commercializing, Doug Engelbart, was also the inventor of the mouse. We had been hearing about mice, I had even used one in a demo at Xerox PARC, but now, with the Mac -- I had one on my desk. I loved the Mac at first sight, but the foundation of our long-term relationship was the mouse (30 years later, I'm writing this story in my outliner Fargo on a Mac, with a mouse).
The rollout on January 24th was like a college graduation ceremony. There were the frat boys, the insiders, the football players, and developers played a role too. We praised their product, their achievement, and they showed off our work. Apple took a serious stake in the success of software on their platform. They also had strong opinions about how our software should work, which in hindsight were almost all good ideas. The idea of user interface standards was at the time controversial. Today, you'll get no argument from me. It's better to have one way to do things than to have two or more, no matter how much better the new ones are.
That day, I was on a panel of developers, talking to the press about the new machine. We were all gushing, all excited to be there. I still get goosebumps thinking about it today.
My startup, when the Mac came out, made most of its money off IBM PC software. By 1986, with the help of Bill Campbell at Apple, we got Macs for every employee at our growing company and our board of directors, and the die was cast. We became a Mac software company. The Mac was berry berry good to me (a dated reference to a fictional character from the 1980s on SNL).
But the Mac, while it was a brilliant vision, and its gestalt so lovely, was in its first incarnation, a flawed product. It didn't have enough memory for a machine with so much graphic potential. The screen was tiny, as were the floppy disks. The product came at a time when personal computers were getting hard drives, but the Mac had no ability to expand. The mouse was wonderful, but sometimes you need cursor keys. The first Mac didn't have them. The Mac was a statement, that's for sure -- but it wasn't a very usable statement, at least in its first incarnation.
Now that was all fixed, in relatively short order. By 1986 the Mac had arrow keys, a bigger screen, more memory, and most important an expansion capability. Then it was supremely usable, and kicked butt in the market.
But what about the long-term historical perspective of the Mac? There, the verdict is not so good. Most of the hype about the 30th anniversary has left out this part. True, all personal computers to follow were basically copies of the Mac, some good, some not so good. The ideas were very valid, and have stood the test of time. But it's the missed opportunities caused by the Mac's insistence on being right about everything, even when the Mac was wrong, that caused the fractures in the marketplace that are still visible today in the UI of software, and in the confusion of non-expert users.
Try this out sometime with a friend who is a casual computer user and has a WordPress blog. Ask them to choose a command from a menu, and see which one they choose. There are three menu systems on the screen at the same time! One for the operating system, one for the browser, and one for the web app. This is not simple and not easy to use, and it is the result of Apple's proprietary networking, way back in the '80s, which forced the innovation in networking -- the manifest destiny of personal computers -- to route around the closed-off networking protocols of the Macintosh.
Had Apple, instead of keeping the right to create networking software for itself almost exclusively and producing confusing APIs, taken the opposite approach -- that their APIs were not proprietary and could be cloned by other manufacturers freely and built on by software developers, again, freely -- the networks we use today would work in vastly different and IMHO, much better ways. We'd also be much further along. In many ways, the networking user interfaces we use today are inferior to the ones we used on the Mac in the '80s.
There were a handful of companies that had mastered the Apple networking stack at the time: Reese Jones' Farallon, Andrew Singer's Think Technologies, a pair of developers who made an email app that was bought by Microsoft and became their mail product (the founder of that company, Steve Ullman, was a real visionary), and Don Brown at CE Software with QuickMail. I tried to work with all of them, but it wasn't enough of a critical mass to make a market. I desperately wanted networking for MORE, my outliner. I felt it wasn't complete without it. Had Apple been less restrictive, my career would have been much more interesting. We have the networking today that I wanted then. But it was a long time coming.
The Web should have happened on the Mac. We had the best software, the best developers, the best platform, no 640K limit (don't laugh, software on the PC was limited in how far it could grow). We had it all, but the Apple culture wouldn't let us use it.
I love the Mac. I love what it did for me. It gave me a lot of freedom I wouldn't have gotten any other way. However, it stopped short of where it could have gone, and in doing so, I hope, serves as a lesson for future generations of technologists. When someone argues for reserving the best stuff for your employees, tell them to stop screwing with your success.
As the famous Apple evangelist Guy Kawasaki said, "Let a thousand flowers bloom." Love the developers and the random chaos they bring to you, and be ready for the love that will flow around your platform. It's the only way that works.