The Mac Is the Message
History of Macintosh, a computer that `made a dent in the universe'
``Insanely Great: The Life and Times of Macintosh, the Computer That Changed Everything'' is Steven Levy's ode to his favorite computer, the Apple Macintosh. It's the computer that changed the lives of millions, brought the power of computers to nonwireheads for the first time, created desktop publishing, and, according to Levy, made a small dent in the fabric of the universe.
Such accomplishments are not minor feats. Fortunately, Levy, an experienced chronicler of hackers and the wonders of computer science, is up to the task.
While ``Insanely Great'' isn't the first book to chart the tumultuous birth and uncertain childhood of Macintosh, it's one of the fastest and most enjoyable to read. Others in this well-trodden genre include the biography of Steve Jobs by computer journalist Jeffrey Young: ``Steve Jobs: The Journey Is the Reward'' (Scott, Foresman and Company, 1988) and, to a lesser extent, former ``evangelist'' Guy Kawasaki's two books ``The Macintosh Way: The Art of Guerrilla Management'' (Scott, Foresman and Company, 1989) and ``Selling the Dream'' (HarperCollins, 1992).
Unlike these others, Levy's work is the first to place Macintosh in its true historical context: not merely as a successful machine that guaranteed a decade of profits for one of the world's largest computer companies, but as a catalyst for a revolution that changed the way people think about computers, information, and even themselves.
Levy traces the roots of the Macintosh revolution back to Vannevar Bush's 1945 essay in the Atlantic, ``As We May Think,'' in which Bush - decades ahead of his time - envisioned personal computers, hypertext, and a worldwide network of interlinked data banks. Levy then follows Doug Engelbart, an engineer who was inspired by Bush in 1950 to drop his career, return to school to earn a doctorate, and eventually invent the concept of ``windows,'' a technique for using a computer screen to display several programs at the same time.
By the third chapter, Levy is up to PARC - the Palo Alto Research Center, an arm of the Xerox Corporation - which created the world's first personal computer in 1973, complete with a mouse and its own window system. But the PARC system languished while IBM prepared and finally introduced, eight years later, its barbaric, difficult-to-use, and immensely popular PC.
The difference between Macintosh and these earlier accomplishments was deployment. ``Real artists ship,'' said Steve Jobs, Apple's co-founder and leader of the Macintosh team. No matter how good a computer is, unless it gets out - is shipped - unless it hits the street, it doesn't change any lives. It doesn't make a dent in the universe. In a very real sense, it doesn't matter at all.
Levy does a better job than most other writers at conveying the turbulent manner in which Jobs combined his ``insanely'' great charisma and vision with an ``insane'' management style. At one point in the development of Macintosh, Levy writes, Jobs was furious that the prototype Macintosh took 30 seconds to start up after a person turned it on.
Finally, Jobs pinpointed the source of delay with the system's programmer, Larry Kenyon. But Kenyon, Levy writes, couldn't figure out any way to make the machine start up faster. Jobs was unconvinced:
``Even if it took you three days to make it a single second faster, it would be worth it,'' Jobs said. ``If ten million people use the computer, in one year alone, that's about 360 million turn-ons. How many lifetimes does 360 million seconds equal? Fifty? Would you take three days to save fifty people's lives?'' Kenyon eventually shaved three seconds off the Mac's start-up time, writes Levy, ``sparing a hundred extra souls from the Reaper.''
So far, so good. But then, a few weeks later, Levy writes, Jobs became obsessed with the functional design lines of French food processors, and spent two weeks out of his busy schedule during the height of Macintosh development studying them in Bay Area showrooms. Just how many souls were lost to that exercise?
The paradox of Macintosh, as ``Insanely Great'' makes abundantly clear, is that an easy-to-use computer is incredibly difficult to build. In the final analysis, computers don't save time and work; they simply move it around, from the computer's user to the keyboard of the programmer.
The power of modern computers is that they let a single team of programmers solve an entire set of problems once; then, simply by copying the software, millions of people can solve those problems again and again without great effort. Nevertheless, Macintosh was nearly a failure, because it was so difficult for people to master the art of writing those easy-to-use programs.
It took a group of programmers working in Seattle, under the name Aldus, to write the application program that would be the machine's salvation. That program was PageMaker, the world's first desktop-publishing application. Desktop publishing forever changed the way magazines and newspapers around the world are produced. Even this newspaper was produced on a Macintosh with a desktop-publishing application.
Sadly, one of the problems with ``Insanely Great'' is technical accuracy. Just as 360 million seconds do not make up 50 lifetimes (they barely account for 11 years), in many places Levy blithely repeats technical details that are simply wrong.
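The slip in Jobs's pep talk is easy to verify with back-of-the-envelope arithmetic; a quick check (in Python, purely for illustration):

```python
# How long is 360 million seconds, really?
SECONDS_PER_YEAR = 60 * 60 * 24 * 365  # ignoring leap days

years = 360_000_000 / SECONDS_PER_YEAR
print(round(years, 1))  # about 11.4 years, nowhere near 50 lifetimes
```

Even granting Jobs his 360 million turn-ons a year, the total comes to a little over 11 years of saved time, not 50 lifetimes.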
Although these errors won't detract from the book for any but the most nit-picking nerds, their presence is annoying. These mistakes would have been a lot easier to track down and fix had Levy included an index. No book that purports to be a history, let alone a history of ``the computer that changed everything,'' should be without one.