Orlando used to have a military installation called McCoy Air Force Base, with long runways from which B-52s could take off and reach Cuba, or just about anywhere else, with loads of nukes. But now McCoy has been scrapped and re-purposed. It has been absorbed into Orlando’s civilian airport. The long runways are being used to land 747-loads of tourists from Brazil, Italy, Russia, and Japan, so that they can come to Disney World and steep in our media for a while.

  To traditional cultures, especially word-based ones such as Islam, this is infinitely more threatening than the B-52s ever were. It is obvious, to everyone outside of the United States, that our arch-buzzwords—multiculturalism and diversity—are false fronts that are being used (in many cases unwittingly) to conceal a global trend to eradicate cultural differences. The basic tenet of multiculturalism (or “honoring diversity” or whatever you want to call it) is that people need to stop judging each other—to stop asserting (and, eventually, to stop believing) that this is right and that is wrong, this true and that false, one thing ugly and another thing beautiful, that God exists and has this or that set of qualities.

  The lesson most people are taking home from the twentieth century is that, in order for a large number of different cultures to coexist peacefully on the globe (or even in a neighborhood) it is necessary for people to suspend judgment in this way. Hence (I would argue) our suspicion of, and hostility toward, all authority figures in modern culture. As David Foster Wallace has explained in his essay “E Unibus Pluram,” this is the fundamental message of television; it is the message that people absorb, anyway, after they have steeped in our media long enough. It’s not expressed in these highfalutin terms, of course. It comes through as the presumption that all authority figures—teachers, generals, cops, ministers, politicians—are hypocritical buffoons, and that hip jaded coolness is the only way to be.

  The problem is that once you have done away with the ability to make judgments as to right and wrong, true and false, etc., there’s no real culture left. All that remains is clog dancing and macrame. The ability to make judgments, to believe things, is the entire point of having a culture. I think this is why guys with machine guns sometimes pop up in places like Luxor and begin pumping bullets into Westerners. They perfectly understand the lesson of McCoy Air Force Base. When their sons come home wearing Chicago Bulls caps with the bills turned sideways, the dads go out of their minds.

  The global anticulture that has been conveyed into every cranny of the world by television is a culture unto itself, and by the standards of great and ancient cultures like Islam and France, it seems grossly inferior, at least at first. The only good thing you can say about it is that it makes world wars and Holocausts less likely—and that is actually a pretty good thing!

  The only real problem is that anyone who has no culture, other than this global monoculture, is completely screwed. Anyone who grows up watching TV, never sees any religion or philosophy, is raised in an atmosphere of moral relativism, learns about civics from watching bimbo eruptions on network TV news, and attends a university where postmodernists vie to outdo each other in demolishing traditional notions of truth and quality, is going to come out into the world as one pretty feckless human being. And—again—perhaps the goal of all this is to make us feckless so we won’t nuke each other.

  On the other hand, if you are raised within some specific culture, you end up with a basic set of tools that you can use to think about and understand the world. You might use those tools to reject the culture you were raised in, but at least you’ve got some tools.

  In this country, the people who run things—who populate major law firms and corporate boards—understand all of this at some level. They pay lip service to multiculturalism and diversity and nonjudgmentalness, but they don’t raise their own children that way. I have highly educated, technically sophisticated friends who have moved to small towns in Iowa to live and raise their children, and there are Hasidic Jewish enclaves in New York where large numbers of kids are being brought up according to traditional beliefs. Any suburban community might be thought of as a place where people who hold certain (mostly implicit) beliefs go to live among others who think the same way.

  And these people feel some responsibility not only to their own children but to the country as a whole. Some of the upper class are vile and cynical, of course, but many spend at least part of their time fretting about what direction the country is going in and what responsibilities they have. And so issues that are important to book-reading intellectuals, such as global environmental collapse, eventually percolate through the porous buffer of mass culture and show up as ancient Hindu ruins in Orlando.

  You may be asking: what the hell does all this have to do with operating systems? As I’ve explained, there is no way to explain the domination of the OS market by Apple/Microsoft without looking to cultural explanations, and so I can’t get anywhere, in this essay, without first letting you know where I’m coming from vis-à-vis contemporary culture.

  Contemporary culture is a two-tiered system, like the Morlocks and the Eloi in H. G. Wells’s The Time Machine, except that it’s been turned upside down. In The Time Machine, the Eloi were an effete upper class, supported by lots of subterranean Morlocks who kept the technological wheels turning. But in our world it’s the other way round. The Morlocks are in the minority, and they are running the show, because they understand how everything works. The much more numerous Eloi learn everything they know from being steeped from birth in electronic media directed and controlled by book-reading Morlocks. That many ignorant people could be dangerous if they got pointed in the wrong direction, and so we’ve evolved a popular culture that is (a) almost unbelievably infectious, and (b) neuters every person who gets infected by it, by rendering them unwilling to make judgments and incapable of taking stands.

  Morlocks, who have the energy and intelligence to comprehend details, go out and master complex subjects and produce Disney-like Sensorial Interfaces so that Eloi can get the gist without having to strain their minds or endure boredom. Those Morlocks will go to India and tediously explore a hundred ruins, then come home and build sanitary bug-free versions: highlight films, as it were. This costs a lot, because Morlocks insist on good coffee and first-class airline tickets, but that’s no problem, because Eloi like to be dazzled and will gladly pay for it all.

  Now I realize that most of this probably sounds snide and bitter to the point of absurdity: your basic snotty intellectual throwing a tantrum about those unlettered philistines. As if I were a self-styled Moses, coming down from the mountain all alone, carrying the stone tablets bearing the Ten Commandments carved in immutable stone—the original command line interface—and blowing his stack at the weak, unenlightened Hebrews worshipping images. Not only that, but it sounds like I’m pumping some sort of conspiracy theory.

  But that is not where I’m going with this. The situation I describe here could be bad, but doesn’t have to be bad and isn’t necessarily bad now.

  It simply is the case that we are way too busy, nowadays, to comprehend everything in detail. And it’s better to comprehend it dimly, through an interface, than not at all. Better for ten million Eloi to go on the Kilimanjaro Safari at Disney World than for a thousand cardiovascular surgeons and mutual fund managers to go on “real” ones in Kenya. The boundary between these two classes is more porous than I’ve made it sound. I’m always running into regular dudes—construction workers, auto mechanics, taxi drivers, galoots in general—who were largely aliterate until something made it necessary for them to become readers and start actually thinking about things. Perhaps they had to come to grips with alcoholism, perhaps they got sent to jail, or came down with a disease, or suffered a crisis in religious faith, or simply got bored. Such people can get up to speed on particular subjects quite rapidly. Sometimes their lack of a broad education makes them overapt to go off on intellectual wild-goose chases, but hey, at least a wild-goose chase gives you some exercise. The spectre of a polity controlled by the fads and whims of voters who actually believe that there are significant differences between Bud Lite and Miller Lite, and who think that professional wrestling is for real, is naturally alarming to people who don’t. But then countries controlled via the command line interface, as it were, by double-domed intellectuals, be they religious or secular, are generally miserable places to live.

  Sophisticated people deride Disneyesque entertainments as pat and saccharine, but if the result of that is to instill basically warm and sympathetic reflexes, at a preverbal level, into hundreds of millions of unlettered media-steepers, then how bad can it be? We killed a lobster in our kitchen last night and my daughter cried for an hour. The Japanese, who used to be just about the fiercest people on earth, have become infatuated with cuddly, adorable cartoon characters. My own family—the people I know best—is divided about evenly between people who will probably read this essay and people who almost certainly won’t, and I can’t say for sure that one group is necessarily warmer, happier, or better-adjusted than the other.

  MORLOCKS AND ELOI AT THE KEYBOARD

  Back in the days of the command line interface, users were all Morlocks who had to convert their thoughts into alphanumeric symbols and type them in, a grindingly tedious process that stripped away all ambiguity, laid bare all hidden assumptions, and cruelly punished laziness and imprecision. Then the interface-makers went to work on their GUIs and introduced a new semiotic layer between people and machines. People who use such systems have abdicated the responsibility, and surrendered the power, of sending bits directly to the chip that’s doing the arithmetic, and handed that responsibility and power over to the OS. This is tempting, because giving clear instructions, to anyone or anything, is difficult. We cannot do it without thinking, and depending on the complexity of the situation, we may have to think hard about abstract things, and consider any number of ramifications, in order to do a good job of it. For most of us, this is hard work. We want things to be easier. How badly we want it can be measured by the size of Bill Gates’s fortune.
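
  If you want to see what that old bargain looks like written down, here is a toy sketch, in Python, of the sort of exactness the command line demanded. The verbs and their rules are invented for illustration; they belong to no real shell or operating system.

```python
# A toy command interpreter: the user's intention must arrive as exact
# alphanumeric text, and anything vague, misspelled, or incomplete is
# rejected outright. The verbs here are made up for illustration.

KNOWN = {"copy": 2, "delete": 1, "rename": 2}   # verb -> how many arguments it demands

def interpret(line: str) -> str:
    parts = line.split()
    if not parts:
        return "error: nothing typed"
    verb, args = parts[0], parts[1:]
    if verb not in KNOWN:
        return f"error: '{verb}' is not a command"                        # no guessing at intent
    if len(args) != KNOWN[verb]:
        return f"error: '{verb}' takes exactly {KNOWN[verb]} argument(s)"  # imprecision punished
    return f"ok: would {verb} {' '.join(args)}"

if __name__ == "__main__":
    print(interpret("copy draft.txt backup.txt"))   # precise, therefore accepted
    print(interpret("copy draft.txt"))              # a hidden assumption, laid bare
    print(interpret("cpoy draft.txt backup.txt"))   # laziness, cruelly punished
```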

  The OS has (therefore) become a sort of intellectual labor-saving device that tries to translate humans’ vaguely expressed intentions into bits. In effect we are asking our computers to shoulder responsibilities that have always been considered the province of human beings—we want them to understand our desires, to anticipate our needs, to foresee consequences, to make connections, to handle routine chores without being asked, to remind us of what we ought to be reminded of while filtering out noise.

  At the upper (which is to say, closer to the user) levels, this is done through a set of conventions—menus, buttons, and so on. These work in the sense that analogies work: they help Eloi understand abstract or unfamiliar concepts by likening them to something known. But the loftier word “metaphor” is used.

  The overarching concept of the MacOS was the “desktop metaphor,” and it subsumed any number of lesser (and frequently conflicting, or at least mixed) metaphors. Under a GUI, a file (frequently called “document”) is metaphrased as a window on the screen (which is called a “desktop”). The window is almost always too small to contain the document and so you “move around,” or, more pretentiously, “navigate” in the document by “clicking and dragging” the “thumb” on the “scroll bar.” When you “type” (using a keyboard) or “draw” (using a “mouse”) into the “window” or use pull-down “menus” and “dialog boxes” to manipulate its contents, the results of your labors get stored (at least in theory) in a “file,” and later you can pull the same information back up into another “window.” When you don’t want it anymore, you “drag” it into the “trash.”
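
  For the technically curious, the whole metaphor stack can be sketched in a couple dozen lines of ordinary Python and Tk. The file name is made up and no real word processor is this crude; the point is only to show the window, the scroll bar, and the Save command standing between you and the actual bytes on the disk.

```python
import tkinter as tk

PATH = "notes.txt"    # the "file" (a hypothetical name, for illustration only)

root = tk.Tk()
root.title(PATH)      # the "window" that stands in for the document

text = tk.Text(root, wrap="word", width=60, height=15)
scroll = tk.Scrollbar(root, command=text.yview)    # the "scroll bar" and its "thumb"
text.configure(yscrollcommand=scroll.set)

def save() -> None:
    # "Save" pours whatever is currently in the window back into the file,
    # replacing whatever was stored there before.
    with open(PATH, "w") as f:
        f.write(text.get("1.0", "end-1c"))

tk.Button(root, text="Save", command=save).pack(side="bottom")
scroll.pack(side="right", fill="y")
text.pack(side="left", fill="both", expand=True)

try:
    with open(PATH) as f:      # pull the file's contents up into the window
        text.insert("1.0", f.read())
except FileNotFoundError:
    pass                       # a brand-new "document"

root.mainloop()
```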

  There is massively promiscuous metaphor-mixing going on here, and I could deconstruct it till the cows come home, but I won’t. Consider only one word: “document.” When we document something in the real world, we make fixed, permanent, immutable records of it. But computer documents are volatile, ephemeral constellations of data. Sometimes (as when you’ve just opened or saved them) the document as portrayed in the window is identical to what is stored, under the same name, in a file on the disk, but other times (as when you have made changes without saving them) it is completely different. In any case, every time you hit “Save” you annihilate the previous version of the “document” and replace it with whatever happens to be in the window at the moment. So even the word “save” is being used in a sense that is grotesquely misleading—“destroy one version, save another” would be more accurate.
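
  The destructive nature of “Save” is easy to demonstrate. In the sketch below (file name and contents invented), the second write truncates the file the instant it is opened, so version one is gone before version two is even on the disk:

```python
path = "document.txt"                        # a made-up file, for illustration

with open(path, "w") as f:                   # version one: the morning's work
    f.write("the morning's work")

with open(path, "w") as f:                   # "Save" again: opening with "w" truncates,
    f.write("the afternoon's rewrite")       # annihilating version one on the spot

with open(path) as f:
    print(f.read())                          # only the afternoon's rewrite survives
```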

  Anyone who uses a word processor for very long inevitably has the experience of putting hours of work into a long document and then losing it because the computer crashes or the power goes out. Until the moment that it disappears from the screen, the document seems every bit as solid and real as if it had been typed out in ink on paper. But in the next moment, without warning, it is completely and irretrievably gone, as if it had never existed. The user is left with a feeling of disorientation (to say nothing of annoyance) stemming from a kind of metaphor shear—you realize that you’ve been living and thinking inside of a metaphor that is essentially bogus.

  So GUIs use metaphors to make computing easier, but they are bad metaphors. Learning to use them is essentially a word game, a process of learning new definitions of words such as “window” and “document” and “save” that are different from, and in many cases almost diametrically opposed to, the old. Somewhat improbably, this has worked very well, at least from a commercial standpoint, which is to say that Apple/Microsoft have made a lot of money off of it. All of the other modern operating systems have learned that in order to be accepted by users they must conceal their underlying gutwork beneath the same sort of spackle. This has some advantages: if you know how to use one GUI operating system, you can probably work out how to use any other in a few minutes. Everything works a little differently, like European plumbing—but with some fiddling around, you can type a memo or surf the web.

  Most people who shop for OSes (if they bother to shop at all) are comparing not the underlying functions but the superficial look and feel. The average buyer of an OS is not really paying for, and is not especially interested in, the low-level code that allocates memory or writes bytes onto the disk. What we’re really buying is a system of metaphors. And—much more important—what we’re buying into is the underlying assumption that metaphors are a good way to deal with the world.

  Recently a lot of new hardware has become available that gives computers numerous interesting ways of affecting the real world: making paper spew out of printers, causing words to appear on screens thousands of miles away, shooting beams of radiation through cancer patients, creating realistic moving pictures of the Titanic. Windows is now used as an OS for cash registers and bank tellers’ terminals. My satellite TV system uses a sort of GUI to change channels and show program guides. Modern cellular telephones have a crude GUI built into a tiny LCD screen. Even Legos now have a GUI: you can buy a Lego set called Mindstorms that enables you to build little Lego robots and program them through a GUI on your computer. So we are now asking the GUI to do a lot more than serve as a glorified typewriter. Now we want it to become a generalized tool for dealing with reality. This has become a bonanza for companies that make a living out of bringing new technology to the mass market.

  Obviously you cannot sell a complicated technological system to people without some sort of interface that enables them to use it. The internal combustion engine was a technological marvel in its day, but useless as a consumer good until a clutch, transmission, steering wheel, and throttle were connected to it. That odd collection of gizmos, which survives to this day in every car on the road, made up what we would today call a user interface. But if cars had been invented after Macintoshes, carmakers would not have bothered to gin up all of these arcane devices. We would have a computer screen instead of a dashboard, and a mouse (or at best a joystick) instead of a steering wheel, and we’d shift gears by pulling down a menu:

  PARK

  REVERSE

  NEUTRAL

  3

  2

  1

  Help…

  A few lines of computer code can thus be made to substitute for any imaginable mechanical interface. The problem is that in many cases the substitute is a poor one. Driving a car through a GUI would be a miserable experience. Even if the GUI were perfectly bug-free, it would be incredibly dangerous, because menus and buttons simply can’t be as responsive as direct mechanical controls. My friend’s dad, the gentleman who was restoring the MGB, never would have bothered with it if it had been equipped with a GUI. It wouldn’t have been any fun.
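
  Just to show how few lines we are talking about, here is a sketch of that gearshift menu in Python and Tk. The gear names and the select_gear function are invented for illustration, and no real car works this way, which is precisely the point:

```python
import tkinter as tk

GEARS = ["PARK", "REVERSE", "NEUTRAL", "3", "2", "1"]

root = tk.Tk()
root.title("Transmission")

choice = tk.StringVar(value="PARK")

def select_gear(gear: str) -> None:
    # In a real car this would have to actuate hardware instantly;
    # here it merely records which menu item was picked.
    print("now in", gear)

tk.OptionMenu(root, choice, *GEARS, command=select_gear).pack(padx=20, pady=20)
root.mainloop()
```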

  The steering wheel and gearshift lever were invented during an era when the most complicated technology in most homes was a butter churn. Those early carmakers were simply lucky, in that they could dream up whatever interface was best suited to the task of driving an automobile, and people would learn it. Likewise with the dial telephone and the AM radio. By the time of the Second World War, most people knew several interfaces: they could not only churn butter but also drive a car, dial a telephone, turn on a radio, summon flame from a cigarette lighter, and change a lightbulb.