  None of this is new. Babbage got most of his funding from the British government, which was generous in financing research that could strengthen its economy and empire. He adopted ideas from private industry, most notably the punch cards that had been developed by the textile firms for automated looms. He and his friends were founders of a handful of new peer-network clubs, including the British Association for the Advancement of Science, and though it may seem a stretch to view that august group as a fancy-dress forerunner to the Homebrew Computer Club, both existed to facilitate commons-based peer collaboration and the sharing of ideas.

  * * *

  The most successful endeavors in the digital age were those run by leaders who fostered collaboration while also providing a clear vision. Too often these are seen as conflicting traits: a leader is either very inclusive or a passionate visionary. But the best leaders could be both. Robert Noyce was a good example. He and Gordon Moore drove Intel forward based on a sharp vision of where semiconductor technology was heading, and they both were collegial and nonauthoritarian to a fault. Even Steve Jobs and Bill Gates, with all of their prickly intensity, knew how to build strong teams around them and inspire loyalty.

  Brilliant individuals who could not collaborate tended to fail. Shockley Semiconductor disintegrated. Similarly, collaborative groups that lacked passionate and willful visionaries also failed. After inventing the transistor, Bell Labs went adrift. So did Apple after Jobs was ousted in 1985.

  Most of the successful innovators and entrepreneurs in this book had one thing in common: they were product people. They cared about, and deeply understood, the engineering and design. They were not primarily marketers or salesmen or financial types; when such folks took over companies, it was often to the detriment of sustained innovation. “When the sales guys run the company, the product guys don’t matter so much, and a lot of them just turn off,” Jobs said. Larry Page felt the same: “The best leaders are those with the deepest understanding of the engineering and product design.”34

  Another lesson of the digital age is as old as Aristotle: “Man is a social animal.” What else could explain CB and ham radios or their successors, such as WhatsApp and Twitter? Almost every digital tool, whether designed for it or not, was commandeered by humans for a social purpose: to create communities, facilitate communication, collaborate on projects, and enable social networking. Even the personal computer, which was originally embraced as a tool for individual creativity, inevitably led to the rise of modems, online services, and eventually Facebook, Flickr, and Foursquare.

  Machines, by contrast, are not social animals. They don’t join Facebook of their own volition nor seek companionship for its own sake. When Alan Turing asserted that machines would someday behave like humans, his critics countered that they would never be able to show affection or crave intimacy. To indulge Turing, perhaps we could program a machine to feign affection and pretend to seek intimacy, just as humans sometimes do. But Turing, more than almost anyone, would probably know the difference.

  According to the second part of Aristotle’s quote, the nonsocial nature of computers suggests that they are “either a beast or a god.” Actually, they are neither. Despite all of the proclamations of artificial intelligence engineers and Internet sociologists, digital tools have no personalities, intentions, or desires. They are what we make of them.

  ADA’S LASTING LESSON: POETICAL SCIENCE

  That leads to a final lesson, one that takes us back to Ada Lovelace. As she pointed out, in our symbiosis with machines we humans have brought one crucial element to the partnership: creativity. The history of the digital age—from Bush to Licklider to Engelbart to Jobs, from SAGE to Google to Wikipedia to Watson—has reinforced this idea. And as long as we remain a creative species, this is likely to hold true. “The machines will be more rational and analytic,” IBM’s research director John Kelly says. “People will provide judgment, intuition, empathy, a moral compass, and human creativity.”35

  We humans can remain relevant in an era of cognitive computing because we are able to think different, something that an algorithm, almost by definition, can’t master. We possess an imagination that, as Ada said, “brings together things, facts, ideas, conceptions in new, original, endless, ever-varying combinations.” We discern patterns and appreciate their beauty. We weave information into narratives. We are storytelling as well as social animals.

  Human creativity involves values, intentions, aesthetic judgments, emotions, personal consciousness, and a moral sense. These are what the arts and humanities teach us—and why those realms are as valuable a part of education as science, technology, engineering, and math. If we mortals are to uphold our end of the human-computer symbiosis, if we are to retain a role as the creative partners of our machines, we must continue to nurture the wellsprings of our imagination and originality and humanity. That is what we bring to the party.

  At his product launches, Steve Jobs would conclude with a slide, projected on the screen behind him, of street signs showing the intersection of the Liberal Arts and Technology. At his last such appearance, for the iPad 2 in 2011, he stood in front of that image and declared, “It’s in Apple’s DNA that technology alone is not enough—that it’s technology married with liberal arts, married with the humanities, that yields us the result that makes our heart sing.” That’s what made him the most creative technology innovator of our era.

  The converse of this paean to the humanities, however, is also true. People who love the arts and humanities should endeavor to appreciate the beauties of math and physics, just as Ada did. Otherwise, they will be left as bystanders at the intersection of arts and science, where most digital-age creativity will occur. They will surrender control of that territory to the engineers.

  Many people who celebrate the arts and the humanities, who applaud vigorously the tributes to their importance in our schools, will proclaim without shame (and sometimes even joke) that they don’t understand math or physics. They extol the virtues of learning Latin, but they are clueless about how to write an algorithm or tell BASIC from C++, Python from Pascal. They consider people who don’t know Hamlet from Macbeth to be Philistines, yet they might merrily admit that they don’t know the difference between a gene and a chromosome, or a transistor and a capacitor, or an integral and a differential equation. These concepts may seem difficult. Yes, but so, too, is Hamlet. And like Hamlet, each of these concepts is beautiful. Like an elegant mathematical equation, they are expressions of the glories of the universe.

  C. P. Snow was right about the need to respect both of “the two cultures,” science and the humanities. But even more important today is understanding how they intersect. Those who helped lead the technology revolution were people in the tradition of Ada, who could combine science and the humanities. From her father came a poetic streak and from her mother a mathematical one, and together they instilled in her a love for what she called “poetical science.” Her father defended the Luddites who smashed mechanical looms, but Ada loved how punch cards instructed those looms to weave beautiful patterns, and she envisioned how this wondrous combination of art and technology could be manifest in computers.

  The next phase of the Digital Revolution will bring even more new methods of marrying technology with the creative industries, such as media, fashion, music, entertainment, education, literature, and the arts. Much of the first round of innovation involved pouring old wine—books, newspapers, opinion pieces, journals, songs, television shows, movies—into new digital bottles. But new platforms, services, and social networks are increasingly enabling fresh opportunities for individual imagination and collaborative creativity. Role-playing games and interactive plays are merging with collaborative forms of storytelling and augmented realities. This interplay between technology and the arts will eventually result in completely new forms of expression and formats of media.

  This innovation will come from people who are able to link beauty to engineering, humanity to technology, and poetry to processors. In other words, it will come from the spiritual heirs of Ada Lovelace, creators who can flourish where the arts intersect with the sciences and who have a rebellious sense of wonder that opens them to the beauty of both.

  * * *

  I. A neuron is a nerve cell that transmits information using electrical or chemical signals. A synapse is a structure or pathway that carries a signal from a neuron to another neuron or cell.

  ACKNOWLEDGMENTS

  I want to thank the people who gave me interviews and provided information, including Bob Albrecht, Al Alcorn, Marc Andreessen, Tim Berners-Lee, Stewart Brand, Dan Bricklin, Larry Brilliant, John Seely Brown, Nolan Bushnell, Jean Case, Steve Case, Vint Cerf, Wes Clark, Steve Crocker, Lee Felsenstein, Bob Frankston, Bob Kahn, Alan Kay, Bill Gates, Al Gore, Andy Grove, Justin Hall, Bill Joy, Jim Kimsey, Leonard Kleinrock, Tracy Licklider, Liza Loop, David McQueeney, Gordon Moore, John Negroponte, Larry Page, Howard Rheingold, Larry Roberts, Arthur Rock, Virginia Rometty, Ben Rosen, Steve Russell, Eric Schmidt, Bob Taylor, Paul Terrell, Jimmy Wales, Evan Williams, and Steve Wozniak. I’m also grateful to people who gave useful advice along the way, including Ken Auletta, Larry Cohen, David Derbes, John Doerr, John Hollar, John Markoff, Lynda Resnick, Joe Zeff, and Michael Moritz.

  Rahul Mehta at the University of Chicago and Danny Z. Wilson at Harvard read an early draft to fix any math or engineering mistakes; no doubt I snuck a few in when they weren’t looking, so they shouldn’t be blamed for any lapses. I’m particularly grateful to Strobe Talbott, who read and made extensive comments on a draft. He has done the same for each book I’ve written, going back to The Wise Men in 1986, and I’ve kept every set of his detailed notes as a testament to his wisdom and generosity.

  I also tried something different for this book: crowdsourcing suggestions and corrections on many of the chapters. This isn’t a new thing. Sending around papers for comments is one reason why the Royal Society was created in London in 1660 and why Benjamin Franklin founded the American Philosophical Society. At Time magazine, we had a practice of sending story drafts to all bureaus for their “comments and corrections,” which was very useful. In the past, I’ve sent parts of my drafts to dozens of people I knew. By using the Internet, I could solicit comments and corrections from thousands of people I didn’t know.

  This seemed fitting, because facilitating the collaborative process was one reason the Internet was created. One night when I was writing about that, I realized that I should try using the Internet for this original purpose. It would, I hoped, both improve my drafts and allow me to understand better how today’s Internet-based tools (compared to Usenet and the old bulletin board systems) facilitate collaboration.

  I experimented on many sites. The best, it turned out, was Medium, which was invented by Ev Williams, a character in this book. One excerpt was read by 18,200 people in its first week online. That’s approximately 18,170 more draft readers than I’ve ever had in the past. Scores of readers posted comments, and hundreds sent me emails. This led to many changes and additions as well as an entirely new section (on Dan Bricklin and VisiCalc). I want to thank the hundreds of collaborators, some of whom I have now gotten to know, who helped me in this crowdsourcing process. (Speaking of which, I hope that someone will soon invent a cross between an enhanced eBook and a wiki so that new forms of multimedia history can emerge that are partly author-guided and partly crowdsourced.)

  I also want to thank Alice Mayhew and Amanda Urban, who have been my editor and agent for thirty years, and the team at Simon & Schuster: Carolyn Reidy, Jonathan Karp, Jonathan Cox, Julia Prosser, Jackie Seow, Irene Kheradi, Judith Hoover, Ruth Lee-Mui, and Jonathan Evans. At the Aspen Institute, I am indebted to Pat Zindulka and Leah Bitounis, among many others. I’m also lucky to have three generations of my family willing to read and comment on a draft of this book: my father, Irwin (an electrical engineer); my brother, Lee (a computer consultant); and my daughter, Betsy (a tech writer, who first turned me on to Ada Lovelace). Most of all, I am grateful to my wife, Cathy, the wisest reader and most loving person I’ve ever known.

  ABOUT THE AUTHOR

  Walter Isaacson, the CEO of the Aspen Institute, has been the chairman of CNN and the managing editor of Time magazine. He is the author of Steve Jobs; Einstein: His Life and Universe; Benjamin Franklin: An American Life; and Kissinger: A Biography, and is the coauthor, with Evan Thomas, of The Wise Men: Six Friends and the World They Made. He and his wife live in Washington, DC.


  ALSO BY WALTER ISAACSON

  Steve Jobs

  American Sketches

  Einstein: His Life and Universe

  A Benjamin Franklin Reader

  Benjamin Franklin: An American Life

  Kissinger: A Biography

  The Wise Men: Six Friends and the World They Made (with Evan Thomas)

  Pro and Con


  NOTES

  INTRODUCTION

  1. Henry Kissinger, background briefing for reporters, Jan. 15, 1974, from file in Time magazine archives.

  2. Steven Shapin, The Scientific Revolution (University of Chicago Press, 1996), 1, 5.

  CHAPTER ONE: ADA, COUNTESS OF LOVELACE

  1. Lady Byron to Mary King, May 13, 1833. The Byron family letters, including those of Ada, are in the Bodleian Library, Oxford. Transcriptions of Ada’s are in Betty Toole, Ada, the Enchantress of Numbers: A Selection from the Letters (Strawberry, 1992) and in Doris Langley Moore, Ada, Countess of Lovelace (John Murray, 1977). In addition to sources cited below, this section also draws on Joan Baum, The Calculating Passion of Ada Byron (Archon, 1986); William Gibson and Bruce Sterling, The Difference Engine (Bantam, 1991); Dorothy Stein, Ada (MIT Press, 1985); Doron Swade, The Difference Engine (Viking, 2001); Betty Toole, Ada: Prophet of the Computer Age (Strawberry, 1998); Benjamin Woolley, The Bride of Science (Macmillan, 1999); Jeremy Bernstein, The Analytical Engine (Morrow, 1963); James Gleick, The Information (Pantheon, 2011), chapter 4. Unless otherwise noted, quotes from Ada’s letters rely on the Toole transcriptions.

  Writers about Ada Lovelace range from canonizers to debunkers. The most sympathetic books are those by Toole, Woolley, and Baum; the most scholarly and balanced is Stein’s. For a debunking of Ada Lovelace, see Bruce Collier, “The Little Engines That Could’ve,” PhD dissertation, Harvard, 1970, http://robroy.dyndns.info/collier/. He writes, “She was a manic depressive with the most amazing delusions about her talents. . . . Ada was as mad as a hatter, and contributed little more to the ‘Notes’ than trouble.”

  2. Lady Byron to Dr. William King, June 7, 1833.

  3. Richard Holmes, The Age of Wonder (Pantheon, 2008), 450.

  4. Laura Snyder, The Philosophical Breakfast Club (Broadway, 2011), 190.

  5. Charles Babbage, The Ninth Bridgewater Treatise (1837), chapters 2 and 8, http://www.victorianweb.org/science/science_texts/bridgewater/intro.htm; Snyder, The Philosophical Breakfast Club, 192.

  6. Toole, Ada, the Enchantress of Numbers, 51.

  7. Sophia De Morgan, Memoir of Augustus De Morgan (Longmans, 1882), 9; Stein, Ada, 41.

  8. Holmes, The Age of Wonder, xvi.

  9. Ethel Mayne, The Life and Letters of Anne Isabella, Lady Noel Byron (Scribner’s, 1929), 36; Malcolm Elwin, Lord Byron’s Wife (Murray, 1974), 106.

  10. Lord Byron to Lady Melbourne, Sept. 28, 1812, in John Murray, editor, Lord Byron’s Correspondence (Scribner’s, 1922), 88.

  11. Stein, Ada, 14, from Thomas Moore’s biography of Byron based on Byron’s destroyed journals.


  12. Woolley, The Bride of Science, 60.

  13. Stein, Ada, 16; Woolley, The Bride of Science, 72.

  14. Woolley, The Bride of Science, 92.

  15. Woolley, The Bride of Science, 94.

  16. John Galt, The Life of Lord Byron (Colburn and Bentley, 1830), 316.

  17. Ada to Dr. William King, Mar. 9, 1834, Dr. King to Ada, Mar. 15, 1834; Stein, Ada, 42.

  18. Ada to Dr. William King, Sept. 1, 1834; Stein, Ada, 46.

  19. Woolley, The Bride of Science, 172.

  20. Catherine Turney, Byron’s Daughter: A Biography of Elizabeth Medora Leigh (Readers Union, 1975), 160.

  21. Velma Huskey and Harry Huskey, “Lady Lovelace and Charles Babbage,” IEEE Annals of the History of Computing, Oct.–Dec. 1980.

  22. Ada to Charles Babbage, Nov. 1839.