And how much thought have you given lately to Karl Landsteiner? Karl who? He only saved a billion lives by his discovery of blood groups. Or how about these other heroes?
Scientist | Discovery | Lives Saved
Abel Wolman (1892–1982) and Linn Enslow (1891–1957) | chlorination of water | 177 million
William Foege (1936– ) | smallpox eradication strategy | 131 million
Maurice Hilleman (1919–2005) | eight vaccines | 129 million
John Enders (1897–1985) | measles vaccine | 120 million
Howard Florey (1898–1968) | penicillin | 82 million
Gaston Ramon (1886–1963) | diphtheria and tetanus vaccines | 60 million
David Nalin (1941– ) | oral rehydration therapy | 54 million
Paul Ehrlich (1854–1915) | diphtheria and tetanus antitoxins | 42 million
Andreas Grüntzig (1939–1985) | angioplasty | 15 million
Grace Eldering (1900–1988) and Pearl Kendrick (1890–1980) | whooping cough vaccine | 14 million
Gertrude Elion (1918–1999) | rational drug design | 5 million
The researchers who assembled these conservative estimates calculate that more than five billion lives have been saved (so far) by the hundred or so scientists they selected.5 Of course hero stories don’t do justice to the way science is really done. Scientists stand on the shoulders of giants, collaborate in teams, toil in obscurity, and aggregate ideas across worldwide webs. But whether it’s the scientists or the science that is ignored, the neglect of the discoveries that transformed life for the better is an indictment of our appreciation of the modern human condition.
As a psycholinguist who once wrote an entire book on the past tense, I can single out my favorite example in the history of the English language.6 It comes from the first sentence of a Wikipedia entry:
Smallpox was an infectious disease caused by either of two virus variants, Variola major and Variola minor.
Yes, “smallpox was.” The disease that got its name from the painful pustules that cover the victim’s skin, mouth, and eyes and that killed more than 300 million people in the 20th century has ceased to exist. (The last case was diagnosed in Somalia in 1977.) For this astounding moral triumph we can thank, among others, Edward Jenner, who discovered vaccination in 1796, the World Health Organization, which in 1959 set the audacious goal of eradicating the disease, and William Foege, who figured out that vaccinating small but strategically chosen portions of the vulnerable populations would do the job. In Getting Better, the economist Charles Kenny comments:
The total cost of the program over those ten years . . . was in the region of $312 million—perhaps 32 cents per person in infected countries. The eradication program cost about the same as producing five recent Hollywood blockbusters, or the wing of a B-2 bomber, or a little under one-tenth the cost of Boston’s recent road-improvement project nicknamed the Big Dig. However much one admires the improved views of the Boston waterfront, the lines of the stealth bomber, or the acting skills of Keira Knightley in Pirates of the Caribbean, or indeed of the gorilla in King Kong, this still seems like a very good deal.7
Even as a resident of the Boston waterfront, I’d have to agree. But this stupendous achievement was only the beginning. Wikipedia’s definition of rinderpest (cattle plague), which starved millions of farmers and herders throughout history by wiping out their livestock, is also in the past tense. And four other sources of misery in the developing world are slated for eradication. Jonas Salk did not live to see the Global Polio Eradication Initiative approach its goal: by 2016 the disease had been beaten back to just thirty-seven cases in three countries (Afghanistan, Pakistan, and Nigeria), the lowest in history, with an even lower rate thus far in 2017.8 Guinea worm is a three-foot-long parasite that worms its way into the victim’s lower limbs and diabolically forms a painful blister. When the sufferer soaks his or her foot for relief, the blister bursts, releasing thousands of larvae into the water, which other people drink, continuing the cycle. The only treatment consists of pulling the worm out over several days or weeks. But thanks to a three-decade campaign of education and water treatment by the Carter Center, the number of cases fell from 3.5 million in twenty-one countries in 1986 to just twenty-five cases in three countries in 2016 (and just three in one country in the first quarter of 2017).9 Elephantiasis, river blindness, and blinding trachoma, whose symptoms are as bad as they sound, may also be defined in the past tense by 2030, and measles, rubella, yaws, sleeping sickness, and hookworm are in epidemiologists’ sights as well.10 (Will any of these triumphs be heralded with moments of silence, ringing bells, honking horns, people smiling at strangers and forgiving their enemies?)
Even diseases that are not obliterated are being decimated. Between 2000 and 2015, the number of deaths from malaria (which in the past killed half the people who had ever lived) fell by 60 percent. The World Health Organization has adopted a plan to reduce the rate by another 90 percent by 2030, and to eliminate it from thirty-five of the ninety-seven countries in which it is endemic today (just as it was eliminated from the United States, where it had been endemic until 1951).11 The Bill & Melinda Gates Foundation has adopted the goal of eradicating it altogether.12 As we saw in chapter 5, in the 1990s HIV/AIDS in Africa was a setback for humanity’s progress in lengthening life spans. But the tide turned in the next decade, and the global death rate for children was cut in half, emboldening the UN to agree in 2016 to a plan to end the AIDS epidemic (though not necessarily to eradicate the virus) by 2030.13 Figure 6-1 shows that between 2000 and 2013 the world also saw massive reductions in the number of children dying from the five most lethal infectious diseases. In all, the control of infectious disease since 1990 has saved the lives of more than a hundred million children.14
Figure 6-1: Childhood deaths from infectious disease, 2000–2013
Source: Child Health Epidemiology Reference Group of the World Health Organization, Liu et al. 2014, supplementary appendix.
And in the most ambitious plan of all, a team of global health experts led by the economists Dean Jamison and Lawrence Summers have laid out a roadmap for “a grand convergence in global health” by 2035, when infectious, maternal, and child deaths everywhere in the world could be reduced to the levels found in the healthiest middle-income countries today.15
As impressive as the conquest of infectious disease in Europe and America was, the ongoing progress among the global poor is even more astonishing. Part of the explanation lies in economic development (chapter 8), because a richer world is a healthier world. Part lies in the expanding circle of sympathy, which inspired global leaders such as Bill Gates, Jimmy Carter, and Bill Clinton to make their legacy the health of the poor in distant continents rather than glittering buildings close to home. George W. Bush, for his part, has been praised by even his harshest critics for his policy on African AIDS relief, which saved millions of lives.
But the most powerful contributor was science. “It is knowledge that is the key,” Deaton argues. “Income—although important both in and of itself and as a component of wellbeing . . .—is not the ultimate cause of wellbeing.”16 The fruits of science are not just high-tech pharmaceuticals such as vaccines, antibiotics, antiretrovirals, and deworming pills. They also comprise ideas—ideas that may be cheap to implement and obvious in retrospect, but which save millions of lives. Examples include boiling, filtering, or adding bleach to water; washing hands; giving iodine supplements to pregnant women; breast-feeding and cuddling infants; defecating in latrines rather than in fields, streets, and waterways; protecting sleeping children with insecticide-impregnated bed nets; and treating diarrhea with a solution of salt and sugar in clean water. Conversely, progress can be reversed by bad ideas, such as the conspiracy theory spread by the Taliban and Boko Haram that vaccines sterilize Muslim girls, or the one spread by affluent American activists that vaccines cause autism. Deaton notes that even the idea that lies at the core of the Enlightenment—knowledge can make us better off—may come as a revelation in the parts of the world where people are resigned to their poor health, never dreaming that changes to their institutions and norms could improve it.17
CHAPTER 7
SUSTENANCE
Together with senescence, childbirth, and pathogens, another mean trick has been played on us by evolution and entropy: our ceaseless need for energy. Famine has long been part of the human condition. The Hebrew Bible tells of seven lean years in Egypt; the Christian Bible has Famine as one of the four horsemen of the apocalypse. Well into the 19th century a crop failure could bring sudden misery even to privileged parts of the world. Johan Norberg quotes the childhood reminiscence of a contemporary of one of his ancestors in Sweden in the winter of 1868:
We often saw mother weeping to herself, and it was hard on a mother, not having any food to put on the table for her hungry children. Emaciated, starving children were often seen going from farm to farm, begging for a few crumbs of bread. One day three children came to us, crying and begging for something to still the pangs of hunger. Sadly, her eyes brimming with tears, our mother was forced to tell them that we had nothing but a few crumbs of bread which we ourselves needed. When we children saw the anguish in the unknown children’s supplicatory eyes, we burst into tears and begged mother to share with them what crumbs we had. Hesitantly she acceded to our request, and the unknown children wolfed down the food before going on to the next farm, which was a good way off from our home. The following day all three were found dead between our farm and the next.1
The historian Fernand Braudel has documented that premodern Europe suffered from famines every few decades.2 Desperate peasants would harvest grain before it was ripe, eat grass or human flesh, and pour into cities to beg. Even in good times, many would get the bulk of their calories from bread or gruel, and not many at that: in The Escape from Hunger and Premature Death, 1700–2100, the economist Robert Fogel noted that “the energy value of the typical diet in France at the start of the eighteenth century was as low as that of Rwanda in 1965, the most malnourished nation for that year.”3 Many of those who were not starving were too weak to work, which locked them into poverty. Hungry Europeans titillated themselves with food pornography, such as tales of Cockaigne, a country where pancakes grew on trees, the streets were paved with pastry, roasted pigs wandered around with knives in their backs for easy carving, and cooked fish jumped out of the water and landed at one’s feet.
Today we live in Cockaigne, and our problem is not too few calories but too many. As the comedian Chris Rock observed, “This is the first society in history where the poor people are fat.” With the usual first-world ingratitude, modern social critics rail against the obesity epidemic with a level of outrage that might be appropriate for a famine (that is, when they are not railing at fat-shaming, slender fashion models, or eating disorders). Though obesity surely is a public health problem, by the standards of history it’s a good problem to have.
What about the rest of the world? The hunger that many Westerners associate with Africa and Asia is by no means a modern phenomenon. India and China have always been vulnerable to famine, because millions of people subsisted on rice that was watered by erratic monsoons or fragile irrigation systems and had to be transported across great distances. Braudel recounts the testimony of a Dutch merchant who was in India during a famine in 1630–31:
“Men abandoned towns and villages and wandered helplessly. It was easy to recognize their condition: eyes sunk deep in the head, lips pale and covered with slime, the skin hard, with the bones showing through, the belly nothing but a pouch hanging down empty. . . . One would cry and howl for hunger, while another lay stretched on the ground dying in misery.” The familiar human dramas followed: wives and children abandoned, children sold by parents, who either abandoned them or sold themselves in order to survive, collective suicides. . . . Then came the stage when the starving split open the stomachs of the dead or dying and “drew at the entrails to fill their own bellies.” “Many hundred thousands of men died of hunger, so that the whole country was covered with corpses lying unburied, which caused such a stench that the whole air was filled and infected with it. . . . In the village of Susuntra . . . human flesh was sold in open market.”4
But in recent times the world has been blessed with another remarkable and little-noticed advance: in spite of burgeoning numbers, the developing world is feeding itself. This is most obvious in China, whose 1.3 billion people now have access to an average of 3,100 calories per person per day, which, according to US government guidelines, is the number needed by a highly active young man.5 India’s billion people get an average of 2,400 calories a day, the number recommended for a highly active young woman or an active middle-aged man. The figure for the continent of Africa comes in between the two at 2,600.6 Figure 7-1, which plots available calories for a representative sample of developed and developing nations and for the world as a whole, shows a pattern familiar from earlier graphs: hardship everywhere before the 19th century, rapid improvement in Europe and the United States over the next two centuries, and, in recent decades, the developing world catching up.
Figure 7-1: Calories, 1700–2013
Sources: United States, England, and France: Our World in Data, Roser 2016d, based on data from Fogel 2004. China, India, and the World: Food and Agriculture Organization of the United Nations, http://www.fao.org/faostat/en/#data.
The numbers plotted in figure 7-1 are averages, and they would be a misleading index of well-being if they were just lifted by rich people scarfing down more calories (if no one was getting fat except Mama Cass). Fortunately, the numbers reflect an increase in the availability of calories throughout the range, including the bottom. When children are underfed, their growth is stunted, and throughout their lives they have a higher risk of getting sick and dying. Figure 7-2 shows the proportion of children who are stunted in a representative sample of countries which have data for the longest spans of time. Though the proportion of stunted children in poor countries like Kenya and Bangladesh is deplorable, we see that in just two decades the rate of stunting has been cut in half. Countries like Colombia and China also had high rates of stunting not long ago and have managed to bring them even lower.
Figure 7-2: Childhood stunting, 1966–2014
Source: Our World in Data, Roser 2016j, based on data from the World Health Organization’s Nutrition Landscape Information System, http://www.who.int/nutrition/nlis/en/.
Figure 7-3 offers another look at how the world has been feeding the hungry. It shows the rate of undernourishment (a year or more of insufficient food) for developing countries in five regions and for the world as a whole. In developed countries, which are not included in the estimates, the rate of undernourishment was less than 5 percent during the entire period, statistically indistinguishable from zero. Though 13 percent of people in the developing world being undernourished is far too much, it’s better than 35 percent, which was the level forty-five years earlier, or for that matter 50 percent, an estimate for the entire world in 1947 (not shown on the graph).7 Remember that these figures are proportions. The world added almost five billion people in those seventy years, which means that as the world was reducing the rate of hunger it was also feeding billions of additional mouths.
Figure 7-3: Undernourishment, 1970–2015
Source: Our World in Data, Roser 2016j, based on data from the Food and Agriculture Organization 2014, also reported in http://www.fao.org/economic/ess/ess-fs/ess-fadata/en/.
Not only has chronic undernourishment been in decline, but so have catastrophic famines—the crises that kill people in large numbers and cause widespread wasting (the condition of being two standard deviations below one’s expected weight) and kwashiorkor (the protein deficiency which causes the swollen bellies of the children in photographs that have become icons of famine).8 Figure 7-4 shows the number of deaths in major famines in each decade for the past 150 years, scaled by world population at the time.