In the Beginning...Was the Command Line
Even the hardware that Windows ran on, when compared to the machines put out by Apple, looked like white-trash stuff, and still mostly does. The reason was that Apple was and is a hardware company, while Microsoft was and is a software company. Apple therefore had a monopoly on hardware that could run MacOS, whereas Windows-compatible hardware came out of a free market. The free market seems to have decided that people will not pay for cool-looking computers; PC hardware makers who hire designers to make their stuff look distinctive get their clocks cleaned by Taiwanese clone makers punching out boxes that look as if they belong on cinderblocks in front of someone’s trailer. Apple, on the other hand, could make their hardware as pretty as they wanted to and simply pass the higher prices on to their besotted consumers, like me. Only last week (I am writing this sentence in early January 1999) the technology sections of all the newspapers were filled with adulatory press coverage of how Apple had released the iMac in several happenin’ new colors like Blueberry and Tangerine.
Apple has always insisted on having a hardware monopoly, except for a brief period in the mid-1990s when they allowed clone-makers to compete with them, before subsequently putting them out of business. Macintosh hardware was, consequently, expensive. You didn’t open it up and fool around with it because doing so would void the warranty. In fact, the first Mac was specifically designed to be difficult to open—you needed a kit of exotic tools, which you could buy through little ads that began to appear in the back pages of magazines a few months after the Mac came out on the market. These ads always had a certain disreputable air about them, like pitches for lock-picking tools in the backs of lurid detective magazines.
This monopolistic policy can be explained in at least three different ways.
The charitable explanation is that the hardware monopoly policy reflected a drive on Apple’s part to provide a seamless, unified blending of hardware, operating system, and software. There is something to this. It is hard enough to make an OS that works well on one specific piece of hardware, designed and tested by engineers who work down the hallway from you, in the same company. Making an OS to work on arbitrary pieces of hardware, cranked out by rabidly entrepreneurial clonemakers on the other side of the international date line, is very difficult and accounts for much of the trouble people have using Windows.
The financial explanation is that Apple, unlike Microsoft, is and always has been a hardware company. It simply depends on revenue from selling hardware, and cannot exist without it.
The not-so-charitable explanation has to do with Apple’s corporate culture, which is rooted in Bay Area Baby Boomdom.
Now, since I’m going to talk for a moment about culture, full disclosure is probably in order, to protect myself against allegations of conflict of interest and ethical turpitude: (1) Geographically I am a Seattleite, of a Saturnine temperament, and inclined to take a sour view of the Dionysian Bay Area, just as they tend to be annoyed and appalled by us. (2) Chronologically I am post-Baby Boom. I feel that way, at least, because I never experienced the fun and exciting parts of the whole Boomer scene—just spent a lot of time dutifully chuckling at Boomers’ maddeningly pointless anecdotes about just how stoned they got on various occasions, and politely fielding their assertions about how great their music was. But even from this remove it was possible to glean certain patterns. One that recurred as regularly as an urban legend was about how someone would move into a commune populated by sandal-wearing, peace-sign-flashing flower children and eventually discover that, underneath this facade, the guys who ran it were actually control freaks; and that, because living in a commune, where much lip service was paid to ideals of peace, love, and harmony, had deprived them of normal, socially approved outlets for their control-freakdom, it tended to come out in other, invariably more sinister, ways.
Applying this to the case of Apple Computer will be left as an exercise for the reader, and not a very difficult exercise.
It is a bit unsettling, at first, to think of Apple as a control freak, because it is completely at odds with their corporate image. Weren’t these the guys who aired the famous Super Bowl ads showing suited, blindfolded executives marching like lemmings off a cliff? Isn’t this the company that even now runs ads picturing the Dalai Lama (except in Hong Kong) and Einstein and other offbeat rebels?
It is indeed the same company, and the fact that they have been able to plant this image of themselves as creative and rebellious freethinkers in the minds of so many intelligent and media-hardened skeptics really gives one pause. It is testimony to the insidious power of expensive slick ad campaigns and, perhaps, to a certain amount of wishful thinking in the minds of people who fall for them. It also raises the question of why Microsoft is so bad at PR, when the history of Apple demonstrates that by writing large checks to good ad agencies, you can plant a corporate image in the minds of intelligent people that is completely at odds with reality. (The answer, for people who don’t like Damoclean questions, is that since Microsoft has won the hearts and minds of the silent majority—the bourgeoisie—they don’t give a damn about having a slick image, any more than Dick Nixon did. “I want to believe”—the mantra that Fox Mulder has pinned to his office wall in The X-Files—applies in different ways to these two companies: Mac partisans want to believe in the image of Apple purveyed in those ads, and in the notion that Macs are somehow fundamentally different from other computers, while Windows people want to believe that they are getting something for their money, engaging in a respectable business transaction.)
In any event, as of 1987, both MacOS and Windows were out on the market, running on hardware platforms that were radically different from each other, not only in the sense that MacOS used Motorola CPU chips while Windows used Intel, but in the sense—then overlooked, but in the long run, vastly more significant—that the Apple hardware business was a rigid monopoly and the Windows side was a churning free-for-all.
But the full ramifications of this did not become clear until very recently—in fact, they are still unfolding, in remarkably strange ways, as I’ll explain when we get to Linux. The upshot is that millions of people got accustomed to using GUIs in one form or another. By doing so, they made Apple/Microsoft a lot of money. The fortunes of many people have become bound up with the ability of these companies to continue selling products whose salability is very much open to question.
HONEY-POT, TAR-PIT, WHATEVER
When Gates and Allen invented the idea of selling software, they ran into criticism from both hackers and sober-sided businesspeople. Hackers understood that software was just information and objected to the idea of selling it. These objections were partly moral. The hackers were coming out of the scientific and academic world, where it is imperative to make the results of one’s work freely available to the public. The objections were also partly practical: how can you sell something that can be easily copied? Businesspeople, who are polar opposites of hackers in so many ways, had objections of their own. Accustomed to selling toasters and insurance policies, they naturally had a difficult time understanding how a long collection of ones and zeroes could constitute a salable product.
Obviously Microsoft prevailed over these objections, and so did Apple. But the objections still exist. The most hackerish of all the hackers, the Ur-hacker, as it were, was and is Richard Stallman, who became so annoyed with the evil practice of selling software that in 1984 (the same year that the Macintosh went on sale) he went off and founded something called the Free Software Foundation, which commenced work on something called GNU. GNU is an acronym for GNU’s Not Unix, but this is a joke in more ways than one, because GNU most certainly is a functional replacement for Unix. Because of copyright concerns (“Unix” is trademarked, and the programs it comprises are copyrighted, by AT&T) they simply could not claim that it was Unix, and so, just to be extra safe, they asserted that it wasn’t. Notwithstanding the incomparable talent and drive possessed by Mr. Stallman and other GNU adherents, their project to build a free Unix was a little bit like trying to dig a subway system with a teaspoon. Until, that is, the advent of Linux.*
But the basic idea of recreating an operating system from scratch was perfectly sound and completely doable. It has been done many times. It is inherent in the very nature of operating systems.
Operating systems are not strictly necessary. There is no reason why a sufficiently dedicated coder could not start from nothing with every project and write fresh code to handle such basic, low-level operations as controlling the read/write heads on the disk drives and lighting up pixels on the screen. The very first computers had to be programmed in this way. But since nearly every program needs to carry out those same basic operations, this approach would lead to vast duplication of effort.
Nothing is more disagreeable to the hacker than duplication of effort. The first and most important mental habit that people develop when they learn how to write computer programs is to generalize, generalize, generalize: to make their code as modular and flexible as possible, breaking large problems down into small subroutines that can be used over and over again in different contexts. Consequently, the development of operating systems, despite being technically unnecessary, was inevitable. Because at its heart, an operating system is nothing more than a library containing the most commonly used code, written once (and hopefully written well), and then made available to every coder who needs it.
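A minimal sketch may make the "library" point concrete. The little C program below is purely illustrative (the message is made up; write() and STDOUT_FILENO are the standard POSIX names): it produces its output by borrowing a routine the OS supplies and documents, rather than carrying its own code to drive the hardware.

    /* A purely illustrative sketch of the "OS as library" idea. Instead of
       containing its own code to light up pixels or move read/write heads,
       this program hands its output to write(), a routine the OS supplies,
       documents, and shares with every other program on the machine. */
    #include <unistd.h>

    int main(void) {
        const char msg[] = "every program reuses this same routine\n";
        /* One documented subroutine call stands in for all the low-level
           hardware-wrangling described above. */
        write(STDOUT_FILENO, msg, sizeof msg - 1);
        return 0;
    }

The particular call is beside the point; what matters is that it was written once, inside the OS, and is merely reused here.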
So a proprietary, closed, secret operating system is a contradiction in terms. It goes against the whole point of having an operating system. And it is impossible to keep them secret anyway. The source code—the original lines of text written by the programmers—can be kept secret. But an OS as a whole is a collection of small subroutines that do very specific, very clearly defined jobs. Exactly what those subroutines do has to be made public, quite explicitly and exactly, or else the OS is completely useless to programmers; they can’t make use of those subroutines if they don’t have a complete and perfect understanding of what the subroutines do.
The only thing that isn’t made public is exactly how the subroutines do what they do. But once you know what a subroutine does, it’s generally quite easy (if you are a hacker) to write one of your own that does exactly the same thing. It might take a while, and it is tedious and unrewarding, but in most cases it’s not really hard.
What’s hard, in hacking as in fiction, is not the writing; it’s deciding what to write. And the vendors of commercial OSes have already decided, and published their decisions.
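The argument can be seen in miniature in the sketch below. C is used only for illustration, and strlen() only as a familiar stand-in for any published subroutine: given nothing but the documented behavior, a hacker can produce a replacement that does exactly the same thing, without ever seeing the vendor's source code.

    /* A sketch of the argument above: the *what* of a subroutine is public,
       the *how* is private, and the public description alone is enough to
       build a workalike. */
    #include <stddef.h>
    #include <stdio.h>
    #include <string.h>   /* the vendor's strlen(): behavior documented, internals hidden */

    /* Written purely from the published description: return the number of
       bytes in the string, not counting the terminating null. */
    static size_t my_strlen(const char *s) {
        size_t n = 0;
        while (s[n] != '\0')
            n++;
        return n;
    }

    int main(void) {
        const char *s = "ship in a bottle";
        /* Same answer either way; only the vendor knows how theirs works inside. */
        printf("vendor: %zu  clone: %zu\n", strlen(s), my_strlen(s));
        return 0;
    }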
This has been generally understood for a long time. MS-DOS was duplicated, functionally, by a rival product, written from scratch, called DR-DOS, that did all of the same things in pretty much the same way. In other words, another company was able to write code that did all of the same things as MS-DOS and sell it at a profit. If you are using the Linux OS, you can get a free program called WINE, which is not, strictly speaking, an emulator (the name is a recursive joke: Wine Is Not an Emulator) but a from-scratch reimplementation of Windows; that is, you can open up a window on your desktop that runs Windows programs. It means that a completely functional Windows OS has been recreated inside of Unix, like a ship in a bottle. And Unix itself, which is vastly more sophisticated than MS-DOS, has been built up from scratch many times over. Versions of it are sold by Sun, Hewlett-Packard, AT&T, Silicon Graphics, IBM, and others.
People have, in other words, been rewriting basic OS code for so long that all of the technology that constituted an “operating system” in the traditional (pre-GUI) sense of that phrase is now so cheap and common that it’s literally free. Not only could Gates and Allen not sell MS-DOS today, they could not even give it away, because much more powerful OSes are already being given away. Even the original Windows has become worthless, in that there is no point in owning something that can be emulated inside of Linux—which is, itself, free.
In this way the OS business is very different from, say, the car business. Even an old rundown car has some value. You can use it for making runs to the dump, or strip it for parts. It is the fate of manufactured goods to slowly and gently depreciate as they get old and have to compete against more modern products.
But it is the fate of operating systems to become free.
Microsoft is a great software applications company. Applications—such as Microsoft Word—are an area where innovation brings real, direct, tangible benefits to users. The innovations might be new technology straight from the research department, or they might be in the category of bells and whistles, but in any event they are frequently useful and they seem to make users happy. And Microsoft is in the process of becoming a great research company. But Microsoft is not such a great operating systems company. This is not necessarily because their operating systems are all that bad from a purely technological standpoint. Microsoft’s OSes do have their problems, sure, but they are vastly better than they used to be, and they are adequate for most people.
Why, then, do I say that Microsoft is not such a great operating systems company? Because the very nature of operating systems is such that it is senseless for them to be developed and owned by a specific company. It’s a thankless job to begin with. Applications create possibilities for millions of credulous users, whereas OSes impose limitations on thousands of grumpy coders, and so OS-makers will forever be on the shit-list of anyone who counts for anything in the high-tech world. Applications get used by people whose big problem is understanding all of their features, whereas OSes get hacked by coders who are annoyed by their limitations. The OS business has been good to Microsoft only insofar as it has given them the money they needed to launch a really good applications software business and to hire a lot of smart researchers. Now it really ought to be jettisoned, like a spent booster stage from a rocket. The big question is whether Microsoft is capable of doing this. Or is it addicted to OS sales in the same way as Apple is to selling hardware?
Keep in mind that Apple’s ability to monopolize its own hardware supply was once cited, by learned observers, as a great advantage over Microsoft. At the time, it seemed to place them in a much stronger position. In the end, it nearly killed them, and may kill them yet. The problem, for Apple, was that most of the world’s computer users ended up owning cheaper hardware. But cheap hardware couldn’t run MacOS, and so these people switched to Windows.
Replace “hardware” with “operating systems,” and “Apple” with “Microsoft,” and you can see the same thing about to happen all over again. Microsoft dominates the OS market, which makes them money and seems like a great idea for now. But cheaper and better OSes are available, and they are growing in popularity in parts of the world that are not so saturated with computers as the U.S. Ten years from now, most of the world’s computer users may end up owning these cheaper OSes. But these OSes do not, for the time being, run any Microsoft applications, and so these people will use something else.
To put it more directly: every time someone decides to use a non-Microsoft OS, Microsoft’s OS division, obviously, loses a customer. But, as things stand now, Microsoft’s applications division loses a customer too. This is not such a big deal as long as almost everyone uses Microsoft OSes. But as soon as Windows’ market share begins to slip, the math starts to look pretty dismal for the people in Redmond.
This argument could be countered by saying that Microsoft could simply recompile its applications to run under other OSes. But this strategy goes against most normal corporate instincts. Again the case of Apple is instructive. When things started to go south for Apple, they should have ported their OS to cheap PC hardware. But they didn’t. Instead, they tried to make the most of their brilliant hardware, adding new features and expanding the product line. But this only had the effect of making their OS more dependent on these special hardware features, which made it worse for them in the end.
Likewise, when Microsoft’s position in the OS world is threatened, their corporate instincts will tell them to pile more new features into their operating systems, and then re-jigger their software applications to exploit those special features. But this will only have the effect of making their applications dependent on an OS with declining market share, which will make it worse for them in the end.
The operating system market is a death trap, a tar pit, a slough of despond. There are only two reasons to invest in Apple and Microsoft. (1) Each of these companies is in what we would call a codependency relationship with their customers. The customers Want To Believe, and Apple and Microsoft know how to give them what they want. (2) Each company works very hard to add new features to their OSes, which works to secure customer loyalty, at least for a little while.
Accordingly, most of the remainder of this essay will be about those two topics.
THE TECHNOSPHERE
Unix is the only OS remaining whose GUI (a vast suite of code called the X Window System) is separate from the OS in the old sense of the phrase. This is to say that you can run Unix in pure command line mode if you want to, with no windows, icons, mouses, etc. whatsoever, and it will still be Unix and capable of doing everything Unix is supposed to do. But the other OSes—MacOS, the Windows family, and BeOS—have their GUIs tangled up with the old-fashioned OS functions to the extent that they have to run in GUI mode, or else they are not really running. So it’s no longer really possible to think of GUIs as being distinct from the OS; they’re now an inextricable part of the OSes that they belong to—and they are by far the largest part, and by far the most expensive and difficult part to create.
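For readers who want to see what that separateness looks like in practice, here is a minimal sketch (the filename and the on-screen message are made up; it assumes the X11 development headers and library are installed). The GUI enters the picture only because this ordinary user-space program chooses to link against Xlib and talk to an X server; take the server away and Unix itself carries on unperturbed.

    /* A sketch, not a tutorial: a bare-bones X client. It is an ordinary
       user-space program that links against Xlib and asks the X server,
       if one is running, for a window. The kernel underneath is the same
       Unix either way. Compile with something like: cc hello_x.c -lX11 */
    #include <X11/Xlib.h>
    #include <stdio.h>

    int main(void) {
        Display *dpy = XOpenDisplay(NULL);    /* connect to the X server, if any */
        if (dpy == NULL) {
            /* No GUI running: the OS itself is perfectly happy, as the text says. */
            fprintf(stderr, "no X display -- still Unix, just no GUI\n");
            return 1;
        }
        int scr = DefaultScreen(dpy);
        Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, scr),
                                         10, 10, 300, 100, 1,
                                         BlackPixel(dpy, scr), WhitePixel(dpy, scr));
        XSelectInput(dpy, win, ExposureMask | KeyPressMask);
        XMapWindow(dpy, win);

        XEvent ev;
        for (;;) {                            /* minimal event loop: draw once, quit on a key */
            XNextEvent(dpy, &ev);
            if (ev.type == Expose)
                XDrawString(dpy, win, DefaultGC(dpy, scr), 20, 50,
                            "the GUI is optional", 19);
            if (ev.type == KeyPress)
                break;
        }
        XCloseDisplay(dpy);
        return 0;
    }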