The really exciting part (or the scary part, if your vision of the future is more like the movie The Terminator) is that, once the intelligence explosion happens, we’ll get an AI that is as superior to us at science, politics, invention, and social skills as your computer’s calculator is to you at arithmetic. The problems that have occupied mankind for decades—curing diseases, finding better energy sources, etc.—could, in many cases, be solved in a matter of weeks or months.

  * * *

  The Last Invention

  “Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an ‘intelligence explosion,’ and the intelligence of man would be left far behind. Thus the first ultraintelligent machine is the last invention that man need ever make.”

  —I. J. GOOD, STATISTICIAN WHO HELPED ALAN TURING CRACK THE ENIGMA CODE DURING WORLD WAR II

  * * *

  Again, this might sound far-fetched, but Ray Kurzweil isn’t the only one who thinks an intelligence explosion could occur sometime this century. Justin Rattner, the chief technology officer at Intel, predicts some kind of Singularity by 2048. Michael Nielsen, co-author of the leading textbook on quantum computation, thinks there’s a decent chance of an intelligence explosion by 2100. Richard Sutton, one of the biggest names in AI, predicts an intelligence explosion near the middle of the century. Leading philosopher David Chalmers is 50 percent confident an intelligence explosion will occur by 2100. Participants at a 2009 conference on AI tended to be 50 percent confident that an intelligence explosion would occur by 2045.

  If we can properly prepare for the intelligence explosion and ensure that it goes well for humanity, it could be the best thing that has ever happened on this fragile planet. Consider the difference between humans and chimpanzees, which share 95 percent of their genetic code. A relatively small difference in intelligence gave humans the ability to invent farming, writing, science, democracy, capitalism, birth control, vaccines, space travel, and iPhones—all while chimpanzees kept flinging poo at each other.

  * * *

  Intelligent Design?

  The thought that machines could one day have superhuman abilities should make us nervous. Once the machines are smarter and more capable than we are, we won’t be able to negotiate with them any more than chimpanzees can negotiate with us. What if the machines don’t want the same things we do?

  The truth, unfortunately, is that every kind of AI we know how to build today definitely would not want the same things we do. To build an AI that does, we would need a more flexible “decision theory” for AI design and new techniques for making sense of human preferences. I know that sounds kind of nerdy, but AIs are made of math, and the math we choose determines what kind of AI we end up with.

  These are the kinds of research problems being tackled by the Singularity Institute in America and the Future of Humanity Institute in Great Britain. Unfortunately, our silly species still spends more money each year on lipstick research than it does on figuring out how to make sure that the most important event of this century (maybe of all human history)—the intelligence explosion—actually goes well for us.

  * * *

  Likewise, self-improving machines could perform scientific experiments and build new technologies much faster and more intelligently than humans can. Curing cancer, finding clean energy, and extending life expectancies would be child’s play for them. Imagine living out your own personal fantasy in a different virtual world every day. Imagine exploring the galaxy at near light speed, with a few backup copies of your mind safe at home on earth in case you run into a supernova. Imagine a world where resources are harvested so efficiently that everyone’s basic needs are taken care of, and political and economic incentives are so intelligently fine-tuned that “world peace” becomes, for the first time ever, more than a Super Bowl halftime show slogan.

  With self-improving AI we may be able to eradicate suffering and death just as we once eradicated smallpox. It is not the limits of nature that prevent us from doing this, but only the limits of our current understanding. It may sound like a paradox, but it’s our brains that prevent us from fully understanding our brains.

  TURF WARS

  At this point you might be asking yourself: “Why is this topic in this book? What does any of this have to do with the economy or national security or politics?”

  In fact, it has everything to do with all of those issues, plus a whole lot more. The intelligence explosion will bring about change on a scale and scope not seen in the history of the world. If we don’t prepare for it, things could get very bad, very fast. But if we do prepare for it, the intelligence explosion could be the best thing that has happened since . . . literally ever.

  * * *

  Beyond Bad Sci-Fi Movies

  The “event horizon” of a black hole is the boundary where gravity becomes so strong that you’ve reached “the point of no return.” No rocket in the world would ever be powerful enough to blast you back out once you’ve crossed it. The Singularity is kind of like that: the intelligence explosion is a kind of event horizon because, from that moment forward, everything changes and we can’t go back. Once the machines are stronger than we are, what they want is what happens. So we’d better be careful about precisely specifying (in their code) what they want, before they improve themselves beyond our ability to control them.

  * * *

  But before we get to the kind of life-altering progress that would come after the Singularity, we will first have to deal with a lot of smaller changes, many of which will throw entire industries and ways of life into turmoil. Take the music business, for example. It was not long ago that stores like Tower Records and Sam Goody were doing billions of dollars a year in compact disc sales; now people buy music from home via the Internet. Publishing is currently facing a similar upheaval. Newspapers and magazines have struggled to keep subscribers, booksellers like Borders have been forced into bankruptcy, and customers are forcing publishers to switch to ebooks faster than the publishers might like.

  All of this is to say that some people are already witnessing the early stages of upheaval firsthand. But for everyone else, there is still a feeling that something is different this time; that all of those years of education and experience might be turned upside down in an instant. They might not be able to identify it exactly, but they realize that the world they’ve known for forty, fifty, or sixty years is no longer the same.

  There’s a good reason for that. We feel it and sense it because it’s true. It’s happening. There’s absolutely no question that the world in 2030 will be a very different place than the one we live in today. But there is a question, a large one, about whether that place will be better or worse.

  It’s human nature to resist change. We worry about our families, our careers, and our bank accounts. The executives in industries that are already experiencing cataclysmic shifts would much prefer to go back to the way things were ten years ago, when people still bought music, magazines, and books in stores. The future was predictable. Humans like that; it’s part of our nature.

  But predictability is no longer an option. The intelligence explosion, when it comes in earnest, is going to change everything—we can either be prepared for it and take advantage of it, or we can resist it and get run over.

  Unfortunately, there are a good number of people who are going to resist it. Not only those in affected industries, but those who hold power at all levels. They see how technology is cutting out the middlemen, how people are becoming empowered, how bloggers can break national news and YouTube videos can create superstars.

  And they don’t like it.

  * * *

  Reality Check

  President Obama was recently asked about unemployment in America and he responded by explaining that automation was impacting the hiring of new workers. “There are some structural issues with our economy where a lot of businesses have learned to become much more efficient with a lot fewer workers. You see it when you go to a bank and you use an ATM, you don’t go to a bank teller, or you go to the airport and you’re using a kiosk instead of checking in at the gate.”

  Here’s the problem: That’s not a “structural issue,” that’s reality. (In fact, economists have a name for this strange phenomenon that creates jobs and helps the economy grow: “productivity.”) You either adapt or you die. ATMs and self-service kiosks aren’t the problem—who wouldn’t rather go to an ATM than stand in line for a teller? The problem is that our economy is not yet flexible enough to adapt to these changes. If that is the structural issue that Obama was talking about, then he’s exactly right—and if we don’t fix it soon, we’ll be begging for the days of 9 percent unemployment.

  * * *

  A BATTLE FOR THE FUTURE

  Power bases in business and politics that have been forged over decades, if not centuries, are being threatened with extinction, and they know it. So the owners of that power are trying to hold on. They think they can do that by dragging us backward. They think that, by growing the public’s dependency on government, by taking away the entrepreneurial spirit and its rewards, and by limiting personal freedoms, they can slow down progress.

  But they’re wrong. The intelligence explosion is coming so long as science itself continues. Trying to put the genie back in the bottle by dragging us toward serfdom won’t stop it and will, in fact, only leave the world with an economy and society that are completely unprepared for the amazing things that it could bring.

  Robin Hanson, author of “The Economics of the Singularity” and an associate professor of economics at George Mason University, wrote that after the Singularity, “The world economy, which now doubles in 15 years or so, would soon double in somewhere from a week to a month.”
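  To put Hanson’s numbers in perspective, here is a bit of simple compound-growth arithmetic (this sketch is mine, not Hanson’s; it only converts doubling times into how much bigger the economy would get in a single year):

```python
# Convert economic doubling times into annual growth factors
# to put Hanson's "week to a month" claim in perspective.

def annual_growth_factor(doubling_time_years: float) -> float:
    """How many times larger the economy becomes in one year
    if it doubles every `doubling_time_years` years."""
    return 2 ** (1 / doubling_time_years)

scenarios = [
    ("Today's economy (doubles every ~15 years)", 15),
    ("Post-Singularity, doubling every month", 1 / 12),
    ("Post-Singularity, doubling every week", 1 / 52),
]

for label, doubling_years in scenarios:
    factor = annual_growth_factor(doubling_years)
    print(f"{label}: about {factor:,.1f}x growth per year")

# Today's economy: ~1.0x per year (roughly 4.7 percent annual growth)
# Monthly doubling: ~4,096x per year
# Weekly doubling: roughly 4.5 quadrillion-fold per year
```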

  * * *

  The Progress of Progress

  One of the keys to the Singularity (and to most technological progress in general) is that everything will happen in due time. We can embrace and influence progress, but we can’t mandate it. Take solar energy, for example. The idea of the sun being a clean energy source for the world is amazing—no one disputes that. But so is the idea of flying cars. In neither case has the technology caught up with the vision. Yet, in the case of solar energy, the government has decided that it cannot wait any longer. So it has provided investments and subsidies and mandates—none of which do much of anything to help the technology itself get better faster.

  Looking at the rate of growth, Kurzweil believes that solar power will be ready for mass use in about sixteen years. He writes, “Solar panels are coming down dramatically in cost per watt. And as a result of that, the total amount of solar energy is growing, not linearly, but exponentially. It’s doubling every 2 years and has been for 20 years. And again, it’s a very smooth curve. There’s all these arguments, subsidies and political battles and companies going bankrupt, they’re raising billions of dollars, but behind all that chaos is this very smooth progression.”
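  That “about sixteen years” follows from straightforward doubling arithmetic. Here is a rough back-of-the-envelope check (my sketch, not Kurzweil’s; the assumption that solar supplied roughly half a percent of the world’s energy when he made the prediction is illustrative):

```python
# Back-of-the-envelope check of the solar timeline, using illustrative numbers.
import math

current_share = 0.005     # assume solar meets ~0.5% of world energy demand (illustrative)
target_share = 1.0        # the point at which solar could meet all demand
years_per_doubling = 2    # the doubling time Kurzweil cites

doublings_needed = math.log2(target_share / current_share)
years_needed = doublings_needed * years_per_doubling

print(f"Doublings needed: {doublings_needed:.1f}")                   # about 7.6
print(f"Years at one doubling every two years: {years_needed:.0f}")  # about 15, i.e. "about sixteen"
```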

  The lesson? Interference by governments in technological progress may be good for winning elections (and certainly, in some cases, the right kind of funding can help speed up innovation), but it does nothing to change the underlying growth curve.

  * * *

  That is unfathomable. But even if the rate were much slower, say a doubling of the world economy every two years, the shock waves from that kind of growth would still change everything we’ve come to know and rely on. A machine could offer the ideal farming methods to double or triple crop production, but it can’t force a farmer or an industry to implement them. A machine could find the cure for cancer, but it would be meaningless if the pharmaceutical industry or the Food and Drug Administration refused to allow it. The machines won’t be the problem; humans will be.

  And that’s why I wanted to write about this topic. We are at the forefront of something great, something that will make the Industrial Revolution look, by comparison, like a child discovering his hands. But we have to be prepared. We must be open to the changes that will come, because they will come. Only when we accept that will we be in a position to thrive. We can’t allow politicians to blame progress for our problems. We can’t allow entrenched bureaucrats and power-hungry executives to influence a future that they may have no place in.

  Many people are afraid of these changes—of course they are; it’s part of being human to fear the unknown—but we can’t be so entrenched in the way the world works now that we are unable to handle change, for fear of what those changes might bring.

  Change is going to be as much a part of our future as it has been of our past. Yes, it will happen faster and the changes themselves will be far more dramatic, but if we prepare for it, the change will mostly be positive. But that preparation is the key: we need to become more well-rounded as individuals so that we’re able to constantly adapt to new ways of doing things. In the future, the way you do your job may change four or five times, or even fifty, over the course of your life. Those who cannot, or will not, adapt will be left behind.

  At the same time, the Singularity will give many more people the opportunity to be successful. Because things will change so rapidly there is a much greater likelihood that people will find something they excel at. But it could also mean that people’s successes are much shorter-lived. The days of someone becoming a legend in any one business (think Clive Davis in music, Steven Spielberg in movies, or the Hearst family in publishing) are likely over. But those who embrace and adapt to the coming changes, and surround themselves with others who have done the same, will flourish.

  When major companies, set in their ways, try to convince us that change is bad and that we must stick to the status quo, no matter how much human inquisitiveness and ingenuity try to propel us forward, we must look past them. We must know in our hearts that these changes will come, and that if we welcome them into our world, we’ll become more successful, more free, and more full of light than we could have ever possibly imagined.

  Ray Kurzweil once wrote, “The Singularity is near.” The only question will be whether we are ready for it.

