  The African AIDS dip is a reminder that progress is not an escalator that inexorably raises the well-being of every human everywhere all the time. That would be magic, and progress is an outcome not of magic but of problem-solving. Problems are inevitable, and at times particular sectors of humanity have suffered terrible setbacks. In addition to the African AIDS epidemic, longevity went into reverse for young adults worldwide during the Spanish flu pandemic of 1918–19 and for middle-aged, non-college-educated, non-Hispanic white Americans in the early 21st century.6 But problems are solvable, and the fact that longevity continues to increase in every other Western demographic means that solutions to the problems facing this one exist as well.

  Average life spans are stretched the most by decreases in infant and child mortality, both because children are fragile and because the death of a child brings down the average more than the death of a 60-year-old. Figure 5-2 shows what has happened to child mortality since the Age of Enlightenment in five countries that are more or less representative of their continents.

  Look at the numbers on the vertical axis: they refer to the percentage of children who die before reaching the age of 5. Yes, well into the 19th century, in Sweden, one of the world’s wealthiest countries, between a quarter and a third of all children died before their fifth birthday, and in some years the death toll was close to half. This appears to be typical in human history: a fifth of hunter-gatherer children die in their first year, and almost half before they reach adulthood.7 The spikiness in the curve before the 20th century reflects not just noise in the data but the parlous nature of life: an epidemic, war, or famine could bring death to one’s door at any time. Even the well-to-do could be struck by tragedy: Charles Darwin lost two children in infancy and his beloved daughter Annie at the age of 10.

  Figure 5-2: Child mortality, 1751–2013

  Sources: Our World in Data, Roser 2016a, based on data from the UN Child Mortality Estimates, http://www.childmortality.org/, and the Human Mortality Database, http://www.mortality.org/.

  Then a remarkable thing happened. The rate of child mortality plunged a hundredfold, to a fraction of a percentage point in developed countries, and the plunge went global. As Deaton observed in 2013, “There is not a single country in the world where infant or child mortality today is not lower than it was in 1950.”8 In sub-Saharan Africa, the child mortality rate has fallen from around one in four in the 1960s to less than one in ten in 2015, and the global rate has fallen from 18 to 4 percent—still too high, but sure to come down if the current thrust to improve global health continues.

  Remember two facts behind the numbers. One is demographic: when fewer children die, parents have fewer children, since they no longer have to hedge their bets against losing their entire families. So contrary to the worry that saving children’s lives would only set off a “population bomb” (a major eco-panic of the 1960s and 1970s, which led to calls for reducing health care in the developing world), the decline in child mortality has defused it.9

  The other is personal. The loss of a child is among the most devastating experiences. Imagine the tragedy; then try to imagine it another million times. That’s a quarter of the number of children who did not die last year alone but who would have died had they been born fifteen years earlier. Now repeat, two hundred times or so, for the years since the decline in child mortality began. Graphs like figure 5-2 display a triumph of human well-being whose magnitude the mind cannot begin to comprehend.

  Just as difficult to appreciate is humanity’s impending triumph over another of nature’s cruelties, the death of a mother in childbirth. The God of the Hebrew Bible, ever merciful, told the first woman, “I will multiply your pain in childbearing; in pain you shall bring forth children.” Until recently about one percent of mothers died in the process; for an American woman, being pregnant a century ago was almost as dangerous as having breast cancer today.10 Figure 5-3 shows the trajectory of maternal mortality since 1751 in four countries that are representative of their regions.

  Figure 5-3: Maternal mortality, 1751–2013

  Source: Our World in Data, Roser 2016p, based partly on data from Claudia Hanson of Gapminder, https://www.gapminder.org/data/documentation/gd010/.

  Starting in the late 18th century in Europe, the mortality rate plummeted three hundredfold, from 1.2 to 0.004 percent. The declines have spread to the rest of the world, including the poorest countries, where the death rate has fallen even faster, though for a shorter time because of their later start. The rate for the entire world, after dropping almost in half in just twenty-five years, is now about 0.2 percent, around where Sweden was in 1941.11

  You may be wondering whether the drops in child mortality explain all the gains in longevity shown in figure 5-1. Are we really living longer, or are we just surviving infancy in greater numbers? After all, the fact that people before the 19th century had an average life expectancy at birth of around 30 years doesn’t mean that everyone dropped dead on their thirtieth birthday. The many children who died pulled the average down, canceling out the boost from the people who lived into old age, and such seniors can be found in every society. In the time of the Bible, the days of our years were said to be threescore and ten, and that’s the age at which Socrates’s life was cut short in 399 BCE, not by natural causes but by a cup of hemlock. Most hunter-gatherer tribes have plenty of people in their seventies and even some in their eighties. Though a Hadza woman’s life expectancy at birth is 32.5 years, if she makes it to 45 she can expect to live another 21 years.12

  So do those of us who survive the ordeals of childbirth and childhood today live any longer than the survivors of earlier eras? Yes, much longer. Figure 5-4 shows the life expectancy in the United Kingdom at birth, and at different ages from 1 to 70, over the past three centuries.

  Figure 5-4: Life expectancy, UK, 1701–2013

  Sources: Our World in Data, Roser 2016n. Data before 1845 are for England and Wales and come from OECD Clio Infra, van Zanden et al. 2014. Data from 1845 on are for mid-decade years only, and come from the Human Mortality Database, http://www.mortality.org/.

  No matter how old you are, you have more years ahead of you than people of your age did in earlier decades and centuries. A British baby who had survived the hazardous first year of life would have lived to 47 in 1845, 57 in 1905, 72 in 1955, and 81 in 2011. A 30-year-old could look forward to another thirty-three years of life in 1845, another thirty-six in 1905, another forty-three in 1955, and another fifty-two in 2011. If Socrates had been acquitted in 1905, he could have expected to live another nine years; in 1955, another ten; in 2011, another sixteen. An 80-year-old in 1845 had five more years of life; an 80-year-old in 2011, nine years.

  Similar trends, though with lower numbers (so far), have occurred in every part of the world. For example, a 10-year-old Ethiopian in 1950 could expect to live to 44; a 10-year-old Ethiopian today can expect to live to 61. The economist Steven Radelet has pointed out that “the improvements in health among the global poor in the last few decades are so large and widespread that they rank among the greatest achievements in human history. Rarely has the basic well-being of so many people around the world improved so substantially, so quickly. Yet few people are even aware that it is happening.”13

  And no, the extra years of life will not be spent senile in a rocking chair. Of course the longer you live, the more of those years you’ll live as an older person, with its inevitable aches and pains. But bodies that are better at resisting a mortal blow are also better at resisting the lesser assaults of disease, injury, and wear. As the life span is stretched, our run of vigor is stretched out as well, even if not by the same number of years. A heroic project called the Global Burden of Disease has tried to measure this improvement by tallying not just the number of people who drop dead of each of 291 diseases and disabilities, but how many years of healthy life they lose, weighted by the degree to which each condition compromises the quality of their lives. For the world in 1990, the project estimated that 56.8 of the 64.5 years of life that an average person could be expected to live were years of healthy life. And at least in developed countries, where estimates are available for 2010 as well, we know that out of the 4.7 years of additional expected life we gained in those two decades, 3.8 were healthy years.14 Numbers like these show that people today live far more years in the pink of health than their ancestors lived altogether, healthy and infirm years combined. For many people the greatest fear raised by the prospect of a longer life is dementia, but another pleasant surprise has come to light: between 2000 and 2012, the rate among Americans over 65 fell by a quarter, and the average age at diagnosis rose from 80.7 to 82.4 years.15
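  To make the weighting concrete, here is a deliberately simplified sketch of the idea (the notation is mine, not the project’s actual method): if a person can expect to live L years, and each year is discounted by an average disability weight d̄ between 0 (full health) and 1 (a state as bad as death), then

\[
\text{healthy years} \;\approx\; L \times (1 - \bar{d}), \qquad \bar{d} \;\approx\; 1 - \frac{56.8}{64.5} \;\approx\; 0.12
\]

  Read this way, the 1990 figures imply that the average person lost roughly 12 percent of each expected year of life to disease and disability.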

  There is still more good news. The curves in figure 5-4 are not tapestries of your life that have been drawn out and measured by two of the Fates and will someday be cut by the third. Rather, they are projections from today’s vital statistics, based on the assumption that medical knowledge will be frozen at its current state. It’s not that anyone believes that assumption, but in the absence of clairvoyance about future medical advances we have no other choice. That means you will almost certainly live longer—perhaps much longer—than the numbers you read off the vertical axis.

  People will complain about anything, and in 2001 George W. Bush appointed a President’s Council on Bioethics to deal with the looming threat of biomedical advances that promise longer and healthier lives.16 Its chairman, the physician and public intellectual Leon Kass, decreed that “the desire to prolong youthfulness is an expression of a childish and narcissistic wish incompatible with a devotion to posterity,” and that the years that would be added to other people’s lives were not worth living (“Would professional tennis players really enjoy playing 25 percent more games of tennis?” he asked). Most people would rather decide that for themselves, and even if he is right that “mortality makes life matter,” longevity is not the same as immortality.17 But the fact that experts’ assertions about maximum possible life expectancy have repeatedly been shattered (on average five years after they were published) raises the question of whether longevity will increase indefinitely and someday slip the surly bonds of mortality entirely.18 Should we worry about a world of stodgy multicentenarians who will resist the innovations of ninety-something upstarts and perhaps ban the begetting of pesky children altogether?

  A number of Silicon Valley visionaries are trying to bring that world closer.19 They have funded research institutes which aim not to chip away at mortality one disease at a time but to reverse-engineer the aging process itself and upgrade our cellular hardware to a version without that bug. The result, they hope, will be an increase in the human life span of fifty, a hundred, even a thousand years. In his 2005 bestseller The Singularity Is Near, the inventor Ray Kurzweil forecasts that those of us who make it to 2045 will live forever, thanks to advances in genetics, nanotechnology (such as nanobots that will course through our bloodstream and repair our bodies from the inside), and artificial intelligence, which will not just figure out how to do all this but recursively improve its own intelligence without limit.

  To readers of medical newsletters and other hypochondriacs, the prospects for immortality look rather different. We certainly find incremental improvements to celebrate, such as a decline in the death rate from cancer over the past twenty-five years of around a percentage point a year, saving a million lives in the United States alone.20 But we also are regularly disappointed by miracle drugs that work no better than the placebo, treatments with side effects worse than the disease, and trumpeted benefits that wash out in the meta-analysis. Medical progress today is more Sisyphus than Singularity.

  Lacking the gift of prophecy, no one can say whether scientists will ever find a cure for mortality. But evolution and entropy make it unlikely. Senescence is baked into our genome at every level of organization, because natural selection favors genes that make us vigorous when we are young over those that make us live as long as possible. That bias is built in because of the asymmetry of time: there is a nonzero probability at any moment that we will be felled by an unpreventable accident like a lightning strike or landslide, making the advantage of any costly longevity gene moot. Biologists would have to reprogram thousands of genes or molecular pathways, each with a small and uncertain effect on longevity, to launch the leap to immortality.21

  And even if we were fitted with perfectly tuned biological hardware, the march of entropy would degrade it. As the physicist Peter Hoffman points out, “Life pits biology against physics in mortal combat.” Violently thrashing molecules constantly collide with the machinery of our cells, including the very machinery that staves off entropy by correcting errors and repairing damage. As damage to the various damage-control systems accumulates, the risk of collapse increases exponentially, sooner or later swamping whatever protections biomedical science has given us against constant risks like cancer and organ failure.22

  In my view the best projection of the outcome of our multicentury war on death is Stein’s Law—“Things that can’t go on forever don’t”—as amended by Davies’s Corollary—“Things that can’t go on forever can go on much longer than you think.”

  CHAPTER 6

  HEALTH

  How do we explain the gift of life that has been granted to more and more of our species since the end of the 18th century? The timing offers a clue. In The Great Escape, Deaton writes, “Ever since people rebelled against authority in the Enlightenment, and set about using the force of reason to make their lives better, they have found a way to do so, and there is little doubt that they will continue to win victories against the forces of death.”1 The gains in longevity celebrated in the previous chapter are the spoils of victory against several of those forces—disease, starvation, war, homicide, accidents—and in this chapter and subsequent ones I will tell the story of each.

  For most of human history, the strongest force of death was infectious disease, the nasty feature of evolution in which small, rapidly reproducing organisms make their living at our expense and hitch a ride from body to body in bugs, worms, and bodily effluvia. Epidemics killed by the millions, wiping out entire civilizations and visiting sudden misery on local populations. To take just one example, yellow fever, a viral disease transmitted by mosquitoes, was so named because its victims turned that color before dying in agony. According to an account of an 1878 Memphis epidemic, the sick had “crawled into holes twisted out of shape, their bodies discovered later only by the stench of their decaying flesh. . . . [A mother was found dead] with her body sprawled across the bed . . . black vomit like coffee grounds spattered all over . . . the children rolling on the floor, groaning.”2

  The rich were not spared: in 1836, the wealthiest man in the world, Nathan Meyer Rothschild, died of an infected abscess. Nor the powerful: various British monarchs were cut down by dysentery, smallpox, pneumonia, typhoid, tuberculosis, and malaria. American presidents, too, were vulnerable: William Henry Harrison fell ill shortly after his inauguration in 1841 and died of septic shock thirty-one days later, and James Polk succumbed to cholera three months after leaving office in 1849. As recently as 1924, the sixteen-year-old son of a sitting president, Calvin Coolidge Jr., died of an infected blister he got while playing tennis.

  Ever-creative Homo sapiens had long fought back against disease with quackery such as prayer, sacrifice, bloodletting, cupping, toxic metals, homeopathy, and squeezing a hen to death against an infected body part. But starting in the late 18th century with the invention of vaccination, and accelerating in the 19th with acceptance of the germ theory of disease, the tide of battle began to turn. Handwashing, midwifery, mosquito control, and especially the protection of drinking water by public sewerage and chlorinated tap water would come to save billions of lives. Before the 20th century, cities were piled high in excrement, their rivers and lakes viscous with waste, and their residents drinking and washing their clothes in putrid brown liquid.3 Epidemics were blamed on miasmas—foul-smelling air—until John Snow (1813–1858), the first epidemiologist, determined that cholera-stricken Londoners got their water from an intake pipe that was downstream from an outflow of sewage. Doctors themselves used to be a major health hazard as they went from autopsy to examining room in black coats encrusted with dried blood and pus, probed their patients’ wounds with unwashed hands, and sewed them up with sutures they kept in their buttonholes, until Ignaz Semmelweis (1818–1865) and Joseph Lister (1827–1912) got them to sterilize their hands and equipment. Antisepsis, anesthesia, and blood transfusions allowed surgery to cure rather than torture and mutilate, and antibiotics, antitoxins, and countless other medical advances further beat back the assault of pestilence.

  The sin of ingratitude may not have made the Top Seven, but according to Dante it consigns the sinners to the ninth circle of Hell, and that’s where post-1960s intellectual culture may find itself because of its amnesia for the conquerors of disease. It wasn’t always that way. When I was a boy, a popular literary genre for children was the heroic biography of a medical pioneer such as Edward Jenner, Louis Pasteur, Joseph Lister, Frederick Banting, Charles Best, William Osler, or Alexander Fleming. On April 12, 1955, a team of scientists announced that Jonas Salk’s vaccine against polio—the disease that had killed thousands a year, paralyzed Franklin Roosevelt, and sent many children into iron lungs—was proven safe. According to Richard Carter’s history of the discovery, on that day “people observed moments of silence, rang bells, honked horns, blew factory whistles, fired salutes, . . . took the rest of the day off, closed their schools or convoked fervid assemblies therein, drank toasts, hugged children, attended church, smiled at strangers, and forgave enemies.”4 The city of New York offered to honor Salk with a ticker-tape parade, which he politely declined.