  Now take the small number of brilliant weaponeers and cut it down still further by the proportion with the cunning and luck to outsmart the world’s police, security experts, and counterterrorism forces. The number may not be zero, but it surely isn’t high. As with all complex undertakings, many heads are better than one, and an organization of bio- or cyberterrorists could be more effective than a lone mastermind. But that’s where Kelly’s observation kicks in: the leader would have to recruit and manage a team of co-conspirators who exercised perfect secrecy, competence, and loyalty to the depraved cause. As the size of the team increases, so do the odds of detection, betrayal, infiltrators, blunders, and stings.49

  Serious threats to the integrity of a country’s infrastructure are likely to require the resources of a state.50 Software hacking is not enough; the hacker needs detailed knowledge about the physical construction of the systems he hopes to sabotage. Compromising the Iranian nuclear centrifuges in 2010 with the Stuxnet worm required a coordinated effort by two technologically sophisticated nations, the United States and Israel. State-based cyber-sabotage escalates the malevolence from terrorism to a kind of warfare, where the constraints of international relations, such as norms, treaties, sanctions, retaliation, and military deterrence, inhibit aggressive attacks, as they do in conventional “kinetic” warfare. As we saw in chapter 11, these constraints have become increasingly effective at preventing interstate war.

  Nonetheless, American military officials have warned of a “digital Pearl Harbor” and a “Cyber-Armageddon” in which foreign states or sophisticated terrorist organizations would hack into American sites to crash planes, open floodgates, melt down nuclear power plants, black out power grids, and take down the financial system. Most cybersecurity experts consider the threats to be inflated—a pretext for more military funding, power, and restrictions on Internet privacy and freedom.51 The reality is that so far, not a single person has ever been injured by a cyberattack. The strikes have mostly been nuisances such as doxing, namely leaking confidential documents or e-mail (as in the Russian meddling in the 2016 American election), and distributed denial-of-service attacks, where a botnet (an array of hacked computers) floods a site with traffic. Schneier explains, “A real-world comparison might be if an army invaded a country, then all got in line in front of people at the Department of Motor Vehicles so they couldn’t renew their licenses. If that’s what war looks like in the 21st century, we have little to fear.”52

  For the techno-doomsters, though, tiny probabilities are no comfort. All it will take, they say, is for one hacker or terrorist or rogue state to get lucky, and it’s game over. That’s why the word threat is preceded with existential, giving the adjective its biggest workout since the heyday of Sartre and Camus. In 2001 the chairman of the Joint Chiefs of Staff warned that “the biggest existential threat out there is cyber” (prompting John Mueller to comment, “As opposed to small existential threats, presumably”).

  This existentialism depends on a casual slide from nuisance to adversity to tragedy to disaster to annihilation. Suppose there was an episode of bioterror or bioerror that killed a million people. Suppose a hacker did manage to take down the Internet. Would the country literally cease to exist? Would civilization collapse? Would the human species go extinct? A little proportion, please—even Hiroshima continues to exist! The assumption is that modern people are so helpless that if the Internet ever went down, farmers would stand by and watch their crops rot while dazed city-dwellers starved. But disaster sociology (yes, there is such a field) has shown that people are highly resilient in the face of catastrophe.53 Far from looting, panicking, or sinking into paralysis, they spontaneously cooperate to restore order and improvise networks for distributing goods and services. Enrico Quarantelli noted that within minutes of the Hiroshima nuclear blast,

  survivors engaged in search and rescue, helped one another in whatever ways they could, and withdrew in controlled flight from burning areas. Within a day, apart from the planning undertaken by the government and military organizations that partly survived, other groups partially restored electric power to some areas, a steel company with 20 percent of workers attending began operations again, employees of the 12 banks in Hiroshima assembled in the Hiroshima branch in the city and began making payments, and trolley lines leading into the city were completely cleared with partial traffic restored the following day.54

  One reason that the death toll of World War II was so horrendous is that war planners on both sides adopted the strategy of bombing civilians until their societies collapsed—which they never did.55 And no, this resilience was not a relic of the homogeneous communities of yesteryear. Cosmopolitan 21st-century societies can cope with disasters, too, as we saw in the orderly evacuation of Lower Manhattan following the 9/11 attacks in the United States, and the absence of panic in Estonia in 2007 when the country was struck with a devastating denial-of-service cyberattack.56

  Bioterrorism may be another phantom menace. Biological weapons, renounced in a 1972 international convention by virtually every nation, have played no role in modern warfare. The ban was driven by a widespread revulsion at the very idea, but the world’s militaries needed little convincing, because tiny living things make lousy weapons. They easily blow back and infect the weaponeers, warriors, and citizens of the side that uses them (just imagine the Tsarnaev brothers with anthrax spores). And whether a disease outbreak fizzles out or (literally) goes viral depends on intricate network dynamics that even the best epidemiologists cannot predict.57

  Biological agents are particularly ill-suited to terrorists, whose goal, recall, is not damage but theater (chapter 13).58 The biologist Paul Ewald notes that natural selection among pathogens works against the terrorist’s goal of sudden and spectacular devastation.59 Germs that depend on rapid person-to-person contagion, like the common-cold virus, are selected to keep their hosts alive and ambulatory so they can shake hands with and sneeze on as many people as possible. Germs get greedy and kill their hosts only if they have some other way of getting from body to body, like mosquitoes (for malaria), a contaminable water supply (for cholera), or trenches packed with injured soldiers (for the 1918 Spanish flu). Sexually transmitted pathogens, like HIV and syphilis, are somewhere in between, needing a long and symptomless incubation period during which hosts can infect their partners, after which the germs do their damage. Virulence and contagion thus trade off, and the evolution of germs will frustrate the terrorist’s aspiration to launch a headline-worthy epidemic that is both swift and lethal. Theoretically, a bioterrorist could try to bend the curve with a pathogen that is virulent, contagious, and durable enough to survive outside bodies. But breeding such a fine-tuned germ would require Nazi-like experiments on living humans that even terrorists (to say nothing of teenagers) are unlikely to carry off. It may be more than just luck that the world so far has seen just one successful bioterror attack (the 1984 tainting of salad with salmonella in an Oregon town by the Rajneeshee religious cult, which killed no one) and one spree killing (the 2001 anthrax mailings, which killed five).60

  To be sure, advances in synthetic biology, such as the gene-editing technique CRISPR-Cas9, make it easier to tinker with organisms, including pathogens. But it’s difficult to re-engineer a complex evolved trait by inserting a gene or two, since the effects of any gene are intertwined with the rest of the organism’s genome. Ewald notes, “I don’t think that we are close to understanding how to insert combinations of genetic variants in any given pathogen that act in concert to generate high transmissibility and stably high virulence for humans.”61 The biotech expert Robert Carlson adds that “one of the problems with building any flu virus is that you need to keep your production system (cells or eggs) alive long enough to make a useful quantity of something that is trying to kill that production system. . . . Booting up the resulting virus is still very, very difficult. . . . I would not dismiss this threat completely, but frankly I am much more worried about what Mother Nature is throwing at us all the time.”62

  And crucially, advances in biology work the other way as well: they also make it easier for the good guys (and there are many more of them) to identify pathogens, invent antibiotics that overcome antibiotic resistance, and rapidly develop vaccines.63 An example is the Ebola vaccine, developed in the waning days of the 2014–15 emergency, after public health efforts had capped the toll at twelve thousand deaths rather than the millions that the media had foreseen. Ebola thus joined a list of other falsely predicted pandemics such as Lassa fever, hantavirus, SARS, mad cow disease, bird flu, and swine flu.64 Some of them never had the potential to go pandemic in the first place because they are contracted from animals or food rather than in an exponential tree of person-to-person infections. Others were nipped by medical and public health interventions. Of course no one knows for sure whether an evil genius will someday overcome the world’s defenses and loose a plague upon the world for fun, vengeance, or a sacred cause. But journalistic habits and the Availability and Negativity biases inflate the odds, which is why I have taken Sir Martin up on his bet. By the time you read this you may know who has won.65

  * * *

  Some of the threats to humanity are fanciful or infinitesimal, but one is real: nuclear war.66 The world has more than ten thousand nuclear weapons distributed among nine countries.67 Many are mounted on missiles or loaded in bombers and can be delivered within hours or less to thousands of targets. Each is designed to cause stupendous destruction: a single one could destroy a city, and collectively they could kill hundreds of millions of people by blast, heat, radiation, and radioactive fallout. If India and Pakistan went to war and detonated a hundred of their weapons, twenty million people could be killed right away, and soot from the firestorms could spread through the atmosphere, devastate the ozone layer, and cool the planet for more than a decade, which in turn would slash food production and starve more than a billion people. An all-out exchange between the United States and Russia could cool the Earth by 8°C for years and create a nuclear winter (or at least autumn) that would starve even more.68 Whether or not nuclear war would (as is often asserted) destroy civilization, the species, or the planet, it would be horrific beyond imagining.

  Soon after atom bombs were dropped on Japan, and the United States and the Soviet Union embarked on a nuclear arms race, a new form of historical pessimism took root. In this Promethean narrative, humanity has wrested deadly knowledge from the gods, and, lacking the wisdom to use it responsibly, is doomed to annihilate itself. In one version, it is not just humanity that is fated to follow this tragic arc but any advanced intelligence. That explains why we have never been visited by space aliens, even though the universe must be teeming with them (the so-called Fermi Paradox, after Enrico Fermi, who first wondered about it). Once life originates on a planet, it inevitably progresses to intelligence, civilization, science, nuclear physics, nuclear weapons, and suicidal war, exterminating itself before it can leave its solar system.

  For some intellectuals the invention of nuclear weapons indicts the enterprise of science—indeed, of modernity itself—because the threat of a holocaust cancels out whatever gifts science may have bestowed upon us. The indictment of science seems misplaced, given that since the dawn of the nuclear age, when mainstream scientists were sidelined from nuclear policy, it’s been physical scientists who have waged a vociferous campaign to remind the world of the danger of nuclear war and to urge nations to disarm. Among the illustrious historic figures are Niels Bohr, J. Robert Oppenheimer, Albert Einstein, Isidor Rabi, Leo Szilard, Joseph Rotblat, Harold Urey, C. P. Snow, Victor Weisskopf, Philip Morrison, Herman Feshbach, Henry Kendall, Theodore Taylor, and Carl Sagan. The movement continues among high-profile scientists today, including Stephen Hawking, Michio Kaku, Lawrence Krauss, and Max Tegmark. Scientists have founded the major activist and watchdog organizations, including the Union of Concerned Scientists, the Federation of American Scientists, the Committee for Nuclear Responsibility, the Pugwash Conferences, and the Bulletin of the Atomic Scientists, whose cover shows the famous Doomsday Clock, now set at two and a half minutes to midnight.69

  Physical scientists, unfortunately, often consider themselves experts in political psychology, and many seem to embrace the folk theory that the most effective way to mobilize public opinion is to whip people into a lather of fear and dread. The Doomsday Clock, despite adorning a journal with “Scientists” in its title, does not track objective indicators of nuclear security; rather, it’s a propaganda stunt intended, in the words of its founder, “to preserve civilization by scaring men into rationality.”70 The clock’s minute hand was farther from midnight in 1962, the year of the Cuban Missile Crisis, than it was in the far calmer 2007, in part because the editors, worried that the public had become too complacent, redefined “doomsday” to include climate change.71 And in their campaign to shake people out of their apathy, scientific experts have made some not-so-prescient predictions:

  Only the creation of a world government can prevent the impending self-destruction of mankind.

  —Albert Einstein, 195072

  I have a firm belief that unless we have more serious and sober thought on various aspects of the strategic problem . . . we are not going to reach the year 2000—and maybe not even the year 1965—without a cataclysm.

  —Herman Kahn, 196073

  Within, at the most, ten years, some of those [nuclear] bombs are going off. I am saying this as responsibly as I can. That is the certainty.

  —C. P. Snow, 196174

  I am completely certain—there is not the slightest doubt in my mind—that by the year 2000, you [students] will all be dead.

  —Joseph Weizenbaum, 197675

  They are joined by experts such as the political scientist Hans Morgenthau, a famous exponent of “realism” in international relations, who predicted in 1979:

  In my opinion the world is moving ineluctably towards a third world war—a strategic nuclear war. I do not believe that anything can be done to prevent it.76

  And the journalist Jonathan Schell, whose 1982 bestseller The Fate of the Earth ended as follows:

  One day—and it is hard to believe that it will not be soon—we will make our choice. Either we will sink into the final coma and end it all or, as I trust and believe, we will awaken to the truth of our peril . . . and rise up to cleanse the earth of nuclear weapons.

  This genre of prophecy went out of style when the Cold War ended and humanity had not sunk into the final coma, despite having failed to create a world government or to cleanse the Earth of nuclear weapons. To keep the fear at a boil, activists keep lists of close calls and near-misses intended to show that Armageddon has always been just a glitch away and that humanity has survived only by dint of an uncanny streak of luck.77 The lists tend to lump truly dangerous moments, such as a 1983 NATO exercise that some Soviet officers almost mistook for an imminent first strike, with smaller lapses and snafus, such as a 2013 incident in which an off-duty American general who was responsible for nuclear-armed missiles got drunk and acted boorishly toward women during a four-day trip to Russia.78 The sequence that would escalate to a nuclear exchange is never laid out, nor are alternative assessments given which might put the episodes in context and lessen the terror.79

  The message that many antinuclear activists want to convey is “Any day now we will all die horribly unless the world immediately takes measures which it has absolutely no chance of taking.” The effect on the public is about what you would expect: people avoid thinking about the unthinkable, get on with their lives, and hope the experts are wrong. Mentions of “nuclear war” in books and newspapers have steadily declined since the 1980s, and journalists give far more attention to terrorism, inequality, and sundry gaffes and scandals than they do to a threat to the survival of civilization.80 The world’s leaders are no more moved. Carl Sagan was a coauthor of the first paper warning of a nuclear winter, and when he campaigned for a nuclear freeze by trying to generate “fear, then belief, then response,” he was advised by an arms-control expert, “If you think that the mere prospect of the end of the world is sufficient to change thinking in Washington and Moscow you clearly haven’t spent much time in either of those places.”81

  In recent decades predictions of an imminent nuclear catastrophe have shifted from war to terrorism, such as when the American diplomat John Negroponte wrote in 2003, “There is a high probability that within two years al-Qaeda will attempt an attack using a nuclear or other weapon of mass destruction.”82 Though a probabilistic prediction of an event that fails to occur can never be gainsaid, the sheer number of false predictions (Mueller has more than seventy in his collection, with deadlines staggered over several decades) suggests that prognosticators are biased toward scaring people.83 (In 2004, four American political figures wrote an op-ed on the threat of nuclear terrorism entitled “Our Hair Is on Fire.”)84 The tactic is dubious. People are easily riled by actual attacks with guns and homemade bombs into supporting repressive measures like domestic surveillance or a ban on Muslim immigration. But predictions of a mushroom cloud on Main Street have aroused little interest in policies to combat nuclear terrorism, such as an international program to control fissile material.

  Such backfiring had been predicted by critics of the first nuclear scare campaigns. As early as 1945, the theologian Reinhold Niebuhr observed, “Ultimate perils, however great, have a less lively influence upon the human imagination than immediate resentments and frictions, however small by comparison.”85 The historian Paul Boyer found that nuclear alarmism actually encouraged the arms race by scaring the nation into pursuing more and bigger bombs, the better to deter the Soviets.86 Even the originator of the Doomsday Clock, Eugene Rabinowitch, came to regret his movement’s strategy: “While trying to frighten men into rationality, scientists have frightened many into abject fear or blind hatred.”87