In The Plex
PART ONE
THE WORLD ACCORDING TO GOOGLE
Biography of a Search Engine
1
“It was science fiction more than computer science.”
On February 18, 2010, Judge Denny Chin of the New York Southern District federal court took stock of the packed gallery in Courtroom 23B. It was going to be a long day. He was presiding over a hearing that would provide only a gloss to hundreds of submissions he had already received on this case. “There is just too much to digest,” he said. He shook his head, preparing himself to hear the arguments of twenty-seven representatives of various interest groups or corporations, as well as presentations by some of the lawyers for various parties, lawyers who filled every place in two long tables before him.
The case was The Authors Guild, Inc., Association of American Publishers, et al. v. Google Inc. It was a lawsuit tentatively resolved by a class settlement agreement in which an authors’ group and a publishers’ association set conditions for a technology company to scan and sell books. Judge Chin’s decision would involve important issues affecting the future of digital works, and some of the speakers before the court engaged on those issues. But many of the objectors—and most who addressed the court were objectors to the settlement—focused on a young company headquartered on a sprawling campus in Mountain View, California. That company was Google. The speakers seemed to distrust it, fear it, even despise it.
“A major threat to … freedom of expression and participation in cultural diversity”
“An unjustified monopoly”
“Eviscerates privacy protections”
“Concealment and misdirection”
“Price fixing … a massive market distortion … preying on the desperate”
“May well be a per se violation of the antitrust laws”
(That last statement held special weight, as it came from the U.S. deputy assistant attorney general.)
But the federal government was only one of Google’s surprising opponents. Some of the others were supporters of the public interest, monitoring the privacy rights and pocketbooks of citizens. Others were advocates of free speech. There was even an objector representing the folk singer Arlo Guthrie.
The irony was that Google itself explicitly embraced the lofty values and high moral standards that it was being attacked for flouting. Its founders had consistently stated that their goal was to make the world better, specifically by enabling humanity’s access to information. Google had created an astonishing tool that took advantage of the interconnected nature of the burgeoning World Wide Web, a tool that empowered people to locate even obscure information within seconds. This search engine transformed the way people worked, entertained themselves, and learned. Google made historic profits from that product by creating a new form of advertising—nonintrusive and even useful. It hired the sharpest minds in the world and encouraged them to take on challenges that pushed the boundaries of innovation. Its focus on engineering talent to accomplish difficult goals was a national inspiration. It even warned its shareholders that the company would sometimes forgo profits to pursue practices that served humanity. It accomplished all those achievements with a puckish irreverence that captivated the public and made heroes of its employees.
But that didn’t matter to the objectors in Judge Chin’s courtroom. Those people were Google’s natural allies, and they thought that Google was no longer … good. The mistrust and fear in the courtroom were reflected globally by governments upset by Google’s privacy policies and businesses worried that Google’s disruptive practices would target them next. Everywhere Google’s executives turned, they were faced with protests and lawsuits.
The course of events was baffling to Google’s two founders, Larry Page and Sergey Brin. Of all Google’s projects, the one at issue in the hearing—Google’s Book Search project—was perhaps the most idealistic. It was an audacious attempt to digitize every book ever printed, so that anyone in the world could locate the information within. Google would not give away the full contents of the books, so when users discovered them, they would have reason to buy them. Authors would have new markets; readers would have instant access to knowledge. After being sued by publishers and authors, Google made a deal with them that would make it even easier to access the books and to buy them on the spot. Every library would get a free terminal to connect to the entire corpus of the world’s books. To Google, it was a boon to civilization.
Didn’t people understand?
By all metrics, the company was still thriving. Google still retained its hundreds of millions of users, hosted billions of searches every day, and had growing businesses in video and wireless devices. Its employees were still idealistic and ambitious in the best sense. But a shadow now darkened Google’s image. To many outsiders, the corporate motto that Google had taken seriously—“Don’t be evil”—had become a joke, a bludgeon to be used against it.
What had happened?
Doing good was Larry Page’s plan from the very beginning. Even as a child, he wanted to be an inventor, not simply because his mind aligned perfectly with the nexus of logic and technology (which it did) but because, he says, “I really wanted to change the world.”
Page grew up in Lansing, Michigan, where his father taught computer science at Michigan State. His parents divorced when he was eight, but he was close with both his father and mother—who had her own computer science degree. Naturally, he spoke computers as a primary language. As he later told an interviewer, “I think I was the first kid in my elementary school to turn in a word-processed document.”
Page was not a social animal—people who talked to him often wondered if there were a jigger of Asperger’s in the mix—and could unnerve people by simply not talking. But when he did speak, more often than not he would come out with ideas that bordered on the fantastic. Attending a summer program in leadership (motto: “A healthy disregard for the impossible”) helped move him to action. At the University of Michigan, he became obsessed with transportation and drew up plans for an elaborate monorail system in Ann Arbor, replacing the mundane bus system with a “futuristic” commute between the dorms and the classrooms. It seemed to come as a surprise to him that a multimillion-dollar transit fantasy from an undergraduate would not be quickly embraced and implemented. (Fifteen years after he graduated, Page would bring up the issue again in a meeting with the university’s president.)
His intelligence and imagination were clear. But when you got to know him, what stood out was his ambition. It expressed itself not as a personal drive (though there was that, too) but as a general principle that everyone should think big and then make big things happen. He believed that the only true failure was not attempting the audacious. “Even if you fail at your ambitious thing, it’s very hard to fail completely,” he says. “That’s the thing that people don’t get.” Page always thought about that. When people proposed a short-term solution, Page’s instinct was to think long term. There would eventually be a joke among Googlers that Page “went to the future and came back to tell us about it.”
Page earned a degree in computer science, as his father had. But his destiny was in California, specifically in Silicon Valley. In a way, Page’s arrival at Stanford was a homecoming. He’d lived there briefly in 1979 when his dad had spent a sabbatical at Stanford; some faculty members still remembered him as an insatiably curious seven-year-old. In 1995, Stanford was not only the best place to pursue cutting-edge computer science but, because of the Internet boom, was also the world capital of ambition. Fortunately, Page’s visions extended to the commercial: “Probably from when I was twelve, I knew I was going to start a company eventually,” he’d later say. Page’s brother, nine years older, was already in Silicon Valley, working for an Internet start-up.
Page chose to work in the department’s Human-Computer Interaction Group. The subject would stand Page in good stead in the future with respect to product development, even though figuring out a new model of information retrieval was not squarely an HCI problem. On his desk and permeating his conversations was Apple interface guru Donald Norman’s classic tome The Psychology of Everyday Things, the bible of a religion whose first, and arguably only, commandment is “The user is always right.” (Other Norman disciples, such as Jeff Bezos at Amazon.com, were adopting this creed on the web.) Another influential book was a biography of Nikola Tesla, the brilliant Serb scientist; though Tesla’s contributions arguably matched Thomas Edison’s—and his ambitions were grand enough to impress even Page—he died in obscurity. “I felt like he was a great inventor and it was a sad story,” says Page. “I feel like he could’ve accomplished much more had he had more resources. And he had trouble commercializing the stuff he did. Probably more trouble than he should’ve had. I think that was a good lesson. I didn’t want to just invent things, I also wanted to make the world better, and in order to do that, you need to do more than just invent things.”
The summer before entering Stanford, Page attended a program for accepted candidates that included a tour of San Francisco. The guide was a grad student Page’s age who’d been at Stanford for two years. “I thought he was pretty obnoxious,” Page later said of the guide, Sergey Brin. The details of the encounter have since passed into legend, but their argumentative banter was almost certainly good-natured. Despite the contrast in personalities, in some ways they were twins. Both felt most comfortable in the meritocracy of academia, where brains trumped everything else. Both had an innate understanding of how the ultraconnected world that they enjoyed as computer science (CS) students was about to spread throughout society. Both shared a core belief in the primacy of data. And both were rock stubborn when it came to pursuing their beliefs. When Page settled in that September, he became close friends with Brin, to the point where people thought of them as a set: LarryAndSergey.
Born in Russia, Brin was four when his family immigrated to the United States. His English still maintained a Cyrillic flavor, and his speech was dotted with anachronistic Old World touches such as the use of “what-not” when peers would say “stuff like that.” He had arrived at Stanford at nineteen after whizzing through the University of Maryland, where his father taught, in three years; he was one of the youngest students ever to start the Stanford PhD program. “He skipped a million years,” says Craig Silverstein, who arrived at Stanford a year later and would eventually become Google’s first employee. Sergey was a quirky kid who would zip through Stanford’s hallways on omnipresent Rollerblades. He also had an interest in trapeze. But the professors understood that behind the goofiness was a formidable mathematical mind. Soon after arriving at Stanford, he knocked off all the required tests for a doctorate and was free to sample the courses until he found a suitable entree for a thesis. He supplemented his academics with swimming, gymnastics, and sailing. (When his father asked him in frustration whether he planned to take advanced courses, he said that he might take advanced swimming.) Donald Knuth, a Stanford professor whose magisterial series of books, The Art of Computer Programming, made him the Proust of computer code, recalls driving down the Pacific coast to a conference with Sergey one afternoon and being impressed at his grasp of complicated issues. His adviser, Hector Garcia-Molina, had seen a lot of bright kids go through Stanford, but Brin stood out. “He was brilliant,” Garcia-Molina says.
One task that Brin took on was a numbering scheme for the new Gates Computer Science Building, which was to be the home of the department. (His system used mathematical flourishes.) The structure was named after William Henry Gates III, better known as Bill, the cofounder of Microsoft. Though Gates had spent a couple of years at Harvard and endowed a building named after his mother there, he went on a small splurge of funding palatial new homes for computer science departments at top technical institutions that he didn’t attend, including MIT and Carnegie Mellon—along with Stanford, the trifecta of top CS programs. Even as they sneered at Windows, the next generation of wizards would study in buildings named after Bill Gates.
Did Gates ever imagine that one of those buildings would incubate a rival that might destroy Microsoft?
The graduate computer science program at Stanford was built around close relationships between students and faculty members. They would team up to work on big, real-world problems; the fresh perspective of the young people maintained the vitality of the professor’s interests. “You always follow the students,” says Terry Winograd, who was Page’s adviser. (Page would often remind him that they had met during his dad’s Stanford sabbatical.) Over the years Winograd had become an expert at figuring out where students stood on the spectrum of brainiacs who found their way into the department. Some were kids whose undergrad record was straight A pluses, GRE scores scraping perfection, who would come in and say, “What thesis should I work on?” On the other end of the spectrum were kids like Larry Page, who would come in and say, “Here’s what I think I can do.” And his proposals were crazy. He’d come into the office and talk about doing something with space tethers or solar kites. “It was science fiction more than computer science,” recalls Winograd. But an outlandish mind was a valuable asset, and there was definitely a place in the current science to channel wild creativity.
In 1995, that place was the World Wide Web. It had sprung from the restless brain of a then-obscure British engineer named Tim Berners-Lee, who was working as a technician at the CERN physics research lab in Switzerland. Berners-Lee could sum up his vision in a sentence: “Suppose all the information stored on computers everywhere were linked … there would be a single global information space.”
The web’s pedigree could be traced back to a 1945 paper by the American scientist Vannevar Bush. Entitled “As We May Think,” it outlined a vast storage system called a “memex,” where documents would be connected, and could be recalled, by information breadcrumbs called “trails of association.” The timeline continued to the work of Douglas Engelbart, whose team at the Stanford Research Institute devised a linked document system that lived behind a dazzling interface that introduced the metaphors of windows and files to the digital desktop. Then came a detour to the brilliant but erratic work of an autodidact named Ted Nelson, whose ambitious Xanadu Project (though never completed) was a vision of disparate information linked by “hypertext” connections. Nelson’s work inspired Bill Atkinson, a software engineer who had been part of the original Macintosh team; in 1987 he came up with a link-based system called HyperCard, which he sold to Apple for $100,000 on the condition that the company give it away to all its users. But to really fulfill Vannevar Bush’s vision, you needed a huge system where people could freely post and link their documents.
By the time Berners-Lee had his epiphany, that system was in place: the Internet. While the earliest websites were just ways to distribute academic papers more efficiently, soon people began writing sites with information of all sorts, and others created sites just for fun. By the mid-1990s, people were starting to use the web for profit, and a new word, “e-commerce,” found its way into the lexicon. Amazon.com and eBay became Internet giants. Other sites positioned themselves as gateways, or portals, to the wonders of the Internet.
As the web grew, its linking structure accumulated a mind-boggling value. It treated the aggregate of all its contents as a huge compost of ideas, any one of which could be reached by the act of connecting one document to another. When you looked at a page you could see, usually highlighted in blue, the pointers to other sites that the webmaster had coded on the page—that was the hypertext idea that galvanized Bush, Nelson, and Atkinson. But for the first time, as Berners-Lee had intended, the web was coaxing a critical mass of these linked sites and documents into a single network. In effect, the web was an infinite database, a sort of crazily expanding universe of human knowledge that, in theory, could hold every insight, thought, image, and product for sale. And all of it had an intricate lattice of cross-connections created by the independent linking activity of anyone who had built a page and coded in a link to something elsewhere on the web.
In retrospect, the web was to the digital world what the Louisiana Purchase was to the young United States: the opportunity of a century.
Berners-Lee’s creation was so new that when Stanford got funding from the National Science Foundation in the early 1990s to start a program called the Digital Library Project, the web wasn’t mentioned in the proposal. “The theme of that project was interoperability—how can we make all these resources work together?” recalls Hector Garcia-Molina, who cofounded the project. By 1995, though, Garcia-Molina knew that the World Wide Web would inevitably be part of the projects concocted by the students who worked with the program, including Page and Brin.
Brin already had a National Science Foundation fellowship and didn’t need funding, but he was trying to figure out a dissertation topic. His loose focus was data mining, and with Rajeev Motwani, a young professor he became close with, he helped start a research group called MIDAS, which stood for Mining Data at Stanford. In a résumé he posted on the Stanford site in 1995, he talked about “a new project” to generate personalized movie ratings. “The way it works is as follows,” he wrote. “You rate the movies you have seen. Then the system finds other users with similar tastes to extrapolate how much you like other movies.” Another project he worked on with Garcia-Molina and another student was a system that detected copyright violations by automating searches for duplicates of documents. “He came up with some good algorithms for detecting copies,” says Garcia-Molina. “Now you use Google.”
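The scheme Brin described is an early form of what is now called user-based collaborative filtering: compare a user’s past ratings with everyone else’s, then lean on the most similar users to guess a rating for a movie the first user hasn’t seen. The short Python sketch below illustrates the idea with invented users and ratings; it is only a minimal illustration of the approach, not a reconstruction of Brin’s actual system.

```python
# A minimal sketch of the idea Brin described: find users whose past
# ratings resemble yours, then use their opinions to extrapolate how
# you would rate movies you haven't seen. Users and ratings here are
# invented for illustration.
from math import sqrt

ratings = {
    "alice": {"Alien": 5, "Blade Runner": 4, "Casablanca": 1},
    "bob":   {"Alien": 4, "Blade Runner": 5, "Metropolis": 4},
    "carol": {"Casablanca": 5, "Alien": 1, "Metropolis": 2},
}

def similarity(a, b):
    """Cosine similarity over the movies both users have rated."""
    common = set(a) & set(b)
    if not common:
        return 0.0
    dot = sum(a[m] * b[m] for m in common)
    return dot / (sqrt(sum(a[m] ** 2 for m in common)) *
                  sqrt(sum(b[m] ** 2 for m in common)))

def predict(user, movie):
    """Similarity-weighted average of other users' ratings for `movie`."""
    num = den = 0.0
    for other, their_ratings in ratings.items():
        if other == user or movie not in their_ratings:
            continue
        w = similarity(ratings[user], their_ratings)
        num += w * their_ratings[movie]
        den += w
    return num / den if den else None

# Extrapolate Alice's likely rating for a movie she hasn't seen.
print(predict("alice", "Metropolis"))
```

Cosine similarity is just one plausible way to score “similar tastes”; the essential move, matching a user against the rating histories of others, is the one Brin sketched in his résumé.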
Page was also seeking a dissertation topic. One idea he presented to Winograd, a collaboration with Brin, seemed more promising than the others: creating a system where people could make annotations and comments on websites. But the more Page thought about annotation, the messier it got. For big sites, there would probably be a lot of people who wanted to mark up a page. How would you figure out who gets to comment or whose comment would be the one you’d see first? For that, he says, “We needed a rating system.”