The Computer History Museum is around the corner from Google headquarters in Mountain View, California. Actually it is just about the only thing that can be said to be around the corner from Google headquarters, which takes up almost the entirety of an enormous office park sandwiched between a ten-lane freeway and the San Francisco Bay. The museum’s building is unassuming, like most of the web of clean, quiet glass-and-concrete structures that stretches north from San Jose and from which much of the American empire is administered. Its main permanent attraction is a roughly chronological sequence of twenty small exhibits, starting with the abacus and ending with the internet. The whole thing can be absorbed in tolerable detail in around four hours, and half that time is enough for a general overview.
The CHM is the inheritor of a collection of artifacts that was begun in the late Sixties, when it was already clear to the authors of the computer revolution that what they were doing would be worth memorializing some day. In its current form, however, the museum dates to a major renovation in 2011, funded by a roster of familiar Silicon Valley companies and foundations. Since then, the political and cultural meaning of the museum’s subject matter has changed a great deal, while the museum itself has seemingly changed very little.
If you have been keeping up with the Twitter and NYT op-ed page discourse on these matters, you might now expect me to write in an appropriately disapproving tone that the CHM conveys the naïvely utopian vision of the internet that prevailed during Barack Obama’s first presidential term. In fact, this is what I expected to find when I first visited the museum, but I was not exactly right. It is true that the CHM is oblivious to nearly all the currently fashionable worries about the consequences of the computer revolution—fake news, radicalized incels, social media addiction, shortening attention spans, data harvesting, mimetic behavioral contagion, et cetera. But neither does it partake in the slack-jawed optimism that many of its principal funders were eagerly projecting in public around the time of the renovation. I and those of my generation without any special interest in the inner workings of computers have always experienced Silicon Valley first of all as a looming edifice of para-political power: a network of institutions that, like the government, are too big to see clearly as they alternately nurture and exploit us. The CHM showed me another, very different side of this cradle of digital modernity—the side that is most obvious to the programmers themselves, I imagine.
A computer, I learned at the CHM, is a device that receives an informational input and then performs some pre-specified series of operations (a program) that issue either in an action or in an informational output. The first part of the museum’s main exhibit is devoted to computers from before the electronic age: counting sticks, abacuses, slide rules, and a partial replica of the nineteenth-century mathematician Charles Babbage’s “difference engine,” a vast mechanical apparatus that could perform the laborious work of calculating tables of logarithms and trigonometric functions.
The difference engine, which required a vast quantity of expensively engineered metal parts, was not built in Babbage’s lifetime, and complex mechanical computing seems to have remained a theoretical curiosity until around the time of World War II. At that point it became clear that high-speed calculating devices that could be programmed—i.e., that could be given different sequences of functions to execute—would be useful for calculating artillery trajectories. Researchers made use of new electronics research and precision engineering methods to build electronic calculating devices, which worked faster than their mechanical analogues and performed far more complex operations. The CHM has a few imposing fragments of the first electronic computer, ENIAC, which was designed for calculating artillery firing tables. A different program on ENIAC was used to test the viability of an early design for the thermonuclear bomb.
These early computers were enormous, hugely expensive, and built to realize the purposes of the state: making war first of all, but also simulating complex social and economic processes for the purpose of exercising centralized administrative control. (To the intellectuals of the period who understood human behavior as finally mechanistic, the computer’s capacity for simulation held out the promise of a perfected social science and perhaps also an infallible statecraft.) The shadowy hulk of ENIAC, with its ranks of vacuum tubes, communicates very effectively the menacing power of those early machines; Orson Welles was quite right to insert one into his chiaroscuro 1962 adaptation of Kafka’s Trial.
The museum devotes a good deal of space to the shrinking of the computer, which, like other technical advances that seem inevitable in retrospect, required much ingenuity from many people. The glittering silicon wafer, etched with a million microscopic circuit patterns, is nothing more than the successor to Babbage’s columns of clicking wheels. Shrinking changed everything: as computers got smaller, cheaper, and more accessible, the Kafka era in computer culture ended. (Needless to say, the Kafka era in computing is still very much with us.) But what followed?
According to one—true—account, the Kafka era was succeeded by what I am inclined to call the Stewart Brand era, after the digital impresario and man-about-Silicon-Valley who probably did more than anyone else to fix the meaning of personal computing in the American public’s imagination. Brand was not a computer man but an ideas man, bursting with big thoughts about how compact, inexpensive computers could change civilization. He is the central character in Fred Turner’s From Counterculture to Cyberculture, which tells the story of the links between the Bay Area’s two great contributions to late modernity: the drugs-and-Day-Glo experimentation of the Sixties and the computer revolution. Brand had been part of the Ken Kesey crowd in the Sixties; he helped to organize Kesey’s LSD-soaked Trips Festival, which inaugurated the Age of Aquarius in Haight-Ashbury. Later he founded the Whole Earth Catalog, a literary-minded compilation of resources for back-to-the-land communalist types.
According to Turner, Brand thought the computer could help to decentralize power by making it easy for ordinary people to access complex information and build lateral networks. He and his associates, influenced by the cybernetic and systems theories of figures like Norbert Wiener and Gregory Bateson, envisioned the cosmos as a computer: a system of integrated and self-regulating systems that responded spontaneously to informational inputs. Personal computing would accelerate the exchange of information and create a denser and more efficient web of connections. Carefully planned, hierarchical orders of direction and control would be replaced by self-regulating networks. In other words, Brand painted the computer as the fulfillment of the complex and perhaps contradictory dream of the Sixties: personal liberation and deep community.
Turner talks about a process of “legitimacy exchange” between counterculturalists and coders: the hippies helped the geeks look cool, while the geeks helped the hippies look smart. If you know what to look for at the Computer History Museum, you can find the traces of legitimacy exchange on the computer products of the Seventies, Eighties, and Nineties. Apple in particular branded itself as countercultural and antiauthoritarian: the bright colors, the casual dress at keynotes, the “1984” ad with the woman throwing a sledgehammer at Big Brother. Steve Jobs was obsessed with Dylan and the Beatles and dated Joan Baez. Meanwhile, the CHM nods at how other Silicon Valley institutions like Xerox PARC started bringing beanbag chairs and ping pong tables into the office and giving employees unstructured time to work on high-risk, high-reward projects.
We know where this story goes next. The “Don’t Be Evil” era dawned, bringing Web 2.0, surveillance capitalism, Apple’s spaceship-fortress in Cupertino, indentured servants on H-1B visas, million-dollar sheds in Sunnyvale, Amazon warehouses, predictive policing, Big Brother in every pocket, the Metaverse. MacBooks went gunmetal-gray and that sleek Steve Jobs design suddenly started looking sort of fascist. Beanbag chairs at work turned into free dinner at work, which turned into work as an immersive and totalizing environment that the ideal engineer never leaves. Brand was right that computers would create dense lateral connections for the rapid transfer of information, but what he didn’t understand was that this meant consolidation, not decentralization: big networks require big capital to create and maintain, and whoever owns the infrastructure controls what happens on the network and whatever it generates. Or maybe Brand understood this all too well: after all, the major project of his post-Whole Earth years was the Global Business Network, a futurist consulting firm for megacorporations. He also co-founded and still helps to run the Long Now Foundation, which seems to be a vehicle for tech billionaires to undertake grandiose projects that will live on after them—for example, a giant ten-thousand-year clock that Jeff Bezos is building inside a mountain in Texas. This sort of thing is alluring in its own way, but it is not exactly what the communalists of the Sixties had in mind.
Neither Turner (who published From Counterculture in 2006) nor the CHM tells this story, but to my mind one can draw a straight line from the utopianism of the Stewart Brand era to the world-historical dread of the “Don’t Be Evil” era. The Brandians wanted to build globe-spanning, self-administering networks that would simultaneously regulate human order on a grand scale and provide individuals with the kind of care and connection they were longing for in the Sixties. That is exactly what the algorithmic giants are gradually doing. Google plugs you into a vast web linking human and machine, but like the monotheists’ God, it is also personal: the algorithm knows what you need before you ask it.
The most interesting thing about the Computer History Museum is that it is almost completely oblivious to Brandian utopianism. (I was on the lookout for Brand references in the museum, but I found only one, a mention of a Rolling Stone article he wrote about an early computer game.) I have spent most of this article talking about what the CHM is not. Here is what it is: a beautiful tribute to a long line of noble and free-spirited tinkerers that goes back to Charles Babbage. The programmer comes to light at the CHM as a cross between a carpenter and a poet; writing code is making a world out of words.
Thus one of the museum’s exhibits features videos of two famous programmers—Don Knuth, author of a magisterial text called The Art of Computer Programming, and the Netscape and Mozilla veteran Jamie Zawinski—talking about the experience of writing computer code. Knuth, a gentle and eccentric Wisconsin Lutheran who is now in his eighties, seems close to being overcome by emotion as he talks about his first forays into programming. “It didn’t occur to me that I had to program in order to be a happy man,” he says. “Adding a couple of lines to my program gave me a real high. It must be the way poets feel, the way musicians feel.” In another video, he adds, “It was love at first sight. I could sit all night with that machine and play with it.” Zawinski, more didactic, less childishly delighted, still communicates some of the same thrill of creation as he describes his art. “In a lot of ways, computer code is like prose,” he says. “Elegant code is just like a really well-written paragraph. … It’s about style.” Knuth agrees, even seeming to suggest that his intended audience is man as much as machine: “I just want it to be elegant in a way that hangs together so that somebody can read it and smile.”
From Brand’s world-historical point of view, the flowering of personal computing in the latter part of the twentieth century was a vital step in the emergence of a cybernetic novus ordo seclorum. The Computer History Museum portrays that era quite differently: as the opening of a new field of compulsively pleasurable puzzles. There is a reason why the computer geek, the pimpled monomaniac who gives his life to the esoteric joys of coding, has become a recognized type, and it is not just because work with computers pays well. And there is a reason why the humble garage has such an important place in the mythos of Silicon Valley. The personal computer gave a generation—or rather, the small part of a generation that caught the bug and learned to code—the ability to build big things without having to struggle against gross immovable matter. It opened a new sphere for human initiative and creativity that could, once computers became small and cheap enough, be accessed easily by a driven and capable teenager.
I had to learn a bit of coding myself recently, and I got a taste of the flow state that I suppose those teenagers started to discover during the advent of personal computing. I find the work of reading and writing satisfying in some ultimate sense, but the doing of it is slow and uneven. Sometimes a text unfolds itself to me, or the sentences I need to write come in a rush; sometimes the page repels me and I have to take long breaks or spend quarter-hours going over and over the same lines. When I was coding, however, I discovered that I could concentrate for hours on end late into the night, lengths of consecutive time far exceeding anything I’ve been capable of in my work before or since. The succession of small puzzles, the cycle of tinkering, testing, tinkering, improving, generates a continual stream of chemical rewards, or, in less reductionist terms, an enlivening feeling of incremental progress in the face of meaningful challenge. I also felt some of the stylist’s pride alluded to by Knuth and Zawinski: a sense that my code had a distinct prosody that marked it as mine and that the little immaterial devices I was building bore my imprint.
The hostile interpreter will give this experience a sinister cast: maybe the machines are programming us, maybe the flow state is a chemical displacement from reality as vicious as any drug-induced stupor, maybe the allure of coding is a temptation to a vast and terrible thoughtlessness. Still, there is something richly human about this kind of work, which is not unlike older, more material forms of tinkering. Coding may not call on our most sophisticated deliberative faculties, but whatever building-and-fixing capacity it does engage seems just as deeply worked into us.
The delight of exercising this capacity, not the dream of civilizational transformation and world domination, seems to have animated the Homebrew Computer Club, the Silicon Valley hobbyist group that became the incubator for many of the big names in the later computer industry. The club, which receives an extended treatment in the CHM’s later exhibits, started in yet another Silicon Valley garage, and its anarchic spirit proved difficult to reconcile with the profit motives generated by the computer revolution. As one founding member explains in a museum video, “People who had software would come with a bag full of paper tapes and throw them into the audience to anybody who wanted them. This was sometimes software that they had written, sometimes software that they had found or appropriated or stolen from somebody else. The same was true of hardware.”
Some of those paper tapes contained software written by none other than the young Bill Gates, who expressed his displeasure in “An Open Letter to Hobbyists” that was published in the club’s circular. In retrospect, Gates’s “Open Letter” looks like an emblem of a culture clash between the partisans of computing for the sake of computing and the empire-builders of Microsoft and its competitors. The latter were right to this extent: it would not have been possible to build world-spanning systems of information exchange while respecting the anarchist ethos of the Homebrew Computer Club. Without vast accumulations of capital, economies of scale, and central organs of coordination, it is hard to imagine how personal computing could have reached as many people as it has.
But the hobbyists would not have cared. Don Knuth, for whom programming was a work of love, thought that software should not be patented for the same reason mathematical theorems should not be patented: both, he believed, should be made freely available for others to build on. Though most of the CHM is devoted to the products of large corporations rather than the tinkering of the Homebrew types, its spirit is the Homebrew spirit. The story it tells is about clever people finding fulfillment through the application of creative intelligence to immediate practical problems. This story is perhaps partly or wholly a lie, a screen for a Faustian project of consolidating power or at least making unthinkable sums of money. But the story’s appeal, the glory of the Don Knuth era, is worth trying to understand, if only as a partial answer to the question of why the cleverest people of our time have shown themselves over and over to be childishly oblivious to the most vital problems of politics.
The second time I visited the museum, I brought a friend who was visiting from out of town. That evening he and I hiked up to a hilltop overlooking the cradle of digital modernity. Behind us were green hills rolling down to the Pacific shoreline, the utter West. Far to our left, the towers of San Francisco glimmered faintly silver in the evening light. From some unseen campfire to our right we heard the gentle strains of a guitar. In front of us was what looked like a pleasant wooded valley dotted with bucolic villages, inhabited, no doubt, by a virtuous race of herdsmen and craftsmen, living in peace with the land and one another, marrying and giving in marriage, handing down ancient traditions around their hearths. That idyllic vision and its power over the human heart are perhaps not so far from the vision of the good life that we glimpse at the Computer History Museum.