Enlightenment Now
When all these objections are exhausted, I often see people racking their brains to find some way in which the news cannot be as good as the data suggest. In desperation, they turn to semantics.
Isn’t Internet trolling a form of violence? Isn’t strip-mining a form of violence? Isn’t inequality a form of violence? Isn’t pollution a form of violence? Isn’t poverty a form of violence? Isn’t consumerism a form of violence? Isn’t divorce a form of violence? Isn’t advertising a form of violence? Isn’t keeping statistics on violence a form of violence?
As wonderful as metaphor is as a rhetorical device, it is a poor way to assess the state of humanity. Moral reasoning requires proportionality. It may be upsetting when someone says mean things on Twitter, but it is not the same as the slave trade or the Holocaust. It also requires distinguishing rhetoric from reality. Marching into a rape crisis center and demanding to know what they have done about the rape of the environment does nothing for rape victims and nothing for the environment. Finally, improving the world requires an understanding of cause and effect. Though primitive moral intuitions tend to lump bad things together and find a villain to blame them on, there is no coherent phenomenon of “bad things” that we can seek to understand and eliminate. (Entropy and evolution will generate them in profusion.) War, crime, pollution, poverty, disease, and incivility are evils that may have little in common, and if we want to reduce them, we can’t play word games that make it impossible even to discuss them individually.
* * *
I have run through these objections to prepare the way for my presentation of other measures of human progress. The incredulous reaction to Better Angels convinced me that it isn’t just the Availability heuristic that makes people fatalistic about progress. Nor can the media’s fondness for bad news be blamed entirely on a cynical chase for eyeballs and clicks. No, the psychological roots of progressophobia run deeper.
The deepest is a bias that has been summarized in the slogan “Bad is stronger than good.”21 The idea can be captured in a set of thought experiments suggested by the psychologist Amos Tversky.22 How much better can you imagine yourself feeling than you are feeling right now? How much worse can you imagine yourself feeling? In answering the first hypothetical, most of us can imagine a bit more of a spring in our step or a twinkle in our eye, but the answer to the second one is: it’s bottomless. This asymmetry in mood can be explained by an asymmetry in life (a corollary of the Law of Entropy). How many things could happen to you today that would leave you much better off? How many things could happen that would leave you much worse off? Once again, to answer the first question, we can all come up with the odd windfall or stroke of good luck, but the answer to the second one is: it’s endless. But we needn’t rely on our imaginations. The psychological literature confirms that people dread losses more than they look forward to gains, that they dwell on setbacks more than they savor good fortune, and that they are more stung by criticism than they are heartened by praise. (As a psycholinguist I am compelled to add that the English language has far more words for negative emotions than for positive ones.)23
One exception to the Negativity bias is found in autobiographical memory. Though we tend to remember bad events as well as we remember good ones, the negative coloring of the misfortunes fades with time, particularly the ones that happened to us.24 We are wired for nostalgia: in human memory, time heals most wounds. Two other illusions mislead us into thinking that things ain’t what they used to be: we mistake the growing burdens of maturity and parenthood for a less innocent world, and we mistake a decline in our own faculties for a decline in the times.25 As the columnist Franklin Pierce Adams pointed out, “Nothing is more responsible for the good old days than a bad memory.”
Intellectual culture should strive to counteract our cognitive biases, but all too often it reinforces them. The cure for the Availability bias is quantitative thinking, but the literary scholar Steven Connor has noted that “there is in the arts and humanities an exceptionless consensus about the encroaching horror of the domain of number.”26 This “ideological rather than accidental innumeracy” leads writers to notice, for example, that wars take place today and wars took place in the past and to conclude that “nothing has changed”—failing to acknowledge the difference between an era with a handful of wars that collectively kill in the thousands and an era with dozens of wars that collectively killed in the millions. And it leaves them unappreciative of systemic processes that eke out incremental improvements over the long term.
Nor is intellectual culture equipped to treat the Negativity bias. Indeed, our vigilance for bad things around us opens up a market for professional curmudgeons who call our attention to bad things we may have missed. Experiments have shown that a critic who pans a book is perceived as more competent than a critic who praises it, and the same may be true of critics of society.27 “Always predict the worst, and you’ll be hailed as a prophet,” the musical humorist Tom Lehrer once advised. At least since the time of the Hebrew prophets, who blended their social criticism with forewarnings of disaster, pessimism has been equated with moral seriousness. Journalists believe that by accentuating the negative they are discharging their duty as watchdogs, muckrakers, whistleblowers, and afflicters of the comfortable. And intellectuals know they can attain instant gravitas by pointing to an unsolved problem and theorizing that it is a symptom of a sick society.
The converse is true as well. The financial writer Morgan Housel has observed that while pessimists sound like they’re trying to help you, optimists sound like they’re trying to sell you something.28 Whenever someone offers a solution to a problem, critics will be quick to point out that it is not a panacea, a silver bullet, a magic bullet, or a one-size-fits-all solution; it’s just a Band-Aid or a quick technological fix that fails to get at the root causes and will blow back with side effects and unintended consequences. Of course, since nothing is a panacea and everything has side effects (you can’t do just one thing), these common tropes are little more than a refusal to entertain the possibility that anything can ever be improved.29
Pessimism among the intelligentsia can also be a form of one-upmanship. A modern society is a league of political, industrial, financial, technological, military, and intellectual elites, all competing for prestige and influence, and with differing responsibilities for making the society run. Complaining about modern society can be a backhanded way of putting down one’s rivals—for academics to feel superior to businesspeople, businesspeople to feel superior to politicians, and so on. As Thomas Hobbes noted in 1651, “Competition of praise inclineth to a reverence of antiquity. For men contend with the living, not with the dead.”
Pessimism, to be sure, has a bright side. The expanding circle of sympathy makes us concerned about harms that would have passed unnoticed in more callous times. Today we recognize the Syrian civil war as a humanitarian tragedy. The wars of earlier decades, such as the Chinese Civil War, the partition of India, and the Korean War, are seldom remembered that way, though they killed and displaced more people. When I was growing up, bullying was considered a natural part of boyhood. It would have strained belief to think that someday the president of the United States would deliver a speech about its evils, as Barack Obama did in 2011. As we care about more of humanity, we’re apt to mistake the harms around us for signs of how low the world has sunk rather than how high our standards have risen.
But relentless negativity can itself have unintended consequences, and recently a few journalists have begun to point them out. In the wake of the 2016 American election, the New York Times writers David Bornstein and Tina Rosenberg reflected on the media’s role in its shocking outcome:
Trump was the beneficiary of a belief—near universal in American journalism—that “serious news” can essentially be defined as “what’s going wrong.” . . . For decades, journalism’s steady focus on problems and seemingly incurable pathologies was preparing the soil that allowed Trump’s seeds of discontent and despair to take root. . . . One consequence is that many Americans today have difficulty imagining, valuing or even believing in the promise of incremental system change, which leads to a greater appetite for revolutionary, smash-the-machine change.30
Bornstein and Rosenberg don’t blame the usual culprits (cable TV, social media, late-night comedians) but instead trace it to the shift during the Vietnam and Watergate eras from glorifying leaders to checking their power—with an overshoot toward indiscriminate cynicism, in which everything about America’s civic actors invites an aggressive takedown.
If the roots of progressophobia lie in human nature, is my suggestion that it is on the rise itself an illusion of the Availability bias? Anticipating the methods I will use in the rest of the book, let’s look at an objective measure. The data scientist Kalev Leetaru applied a technique called sentiment mining to every article published in the New York Times between 1945 and 2005, and to an archive of translated articles and broadcasts from 130 countries between 1979 and 2010. Sentiment mining assesses the emotional tone of a text by tallying the number and contexts of words with positive and negative connotations, like good, nice, terrible, and horrific. Figure 4-1 shows the results. Putting aside the wiggles and waves that reflect the crises of the day, we see that the impression that the news has become more negative over time is real. The New York Times got steadily more morose from the early 1960s to the early 1970s, lightened up a bit (but just a bit) in the 1980s and 1990s, and then sank into a progressively worse mood in the first decade of the new century. News outlets in the rest of the world, too, became gloomier and gloomier from the late 1970s to the present day.
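The mechanics of lexicon-based sentiment mining are simple enough to sketch in a few lines. The toy version below is only an illustration of the general technique; the word lists are invented stand-ins, and Leetaru’s actual pipeline is far more sophisticated than a bag-of-words tally.

```python
# A minimal, illustrative sketch of lexicon-based sentiment scoring.
# The word lists are hypothetical stand-ins, not a real research lexicon.
POSITIVE = {"good", "nice", "improve", "progress", "peace"}
NEGATIVE = {"terrible", "horrific", "war", "crisis", "decline"}

def tone(text: str) -> float:
    """Return (positive - negative) / matched words, in [-1, 1];
    0.0 if no sentiment-bearing words are found."""
    words = (w.strip(".,;:!?") for w in text.lower().split())
    pos = neg = 0
    for w in words:
        if w in POSITIVE:
            pos += 1
        elif w in NEGATIVE:
            neg += 1
    return 0.0 if pos + neg == 0 else (pos - neg) / (pos + neg)

print(tone("Terrible war and horrific crisis"))  # → -1.0
print(tone("Good progress toward peace"))        # → 1.0
```

Averaging such scores over every article in a month, as Leetaru did, yields a time series of the news’s emotional tone.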
So has the world really gone steadily downhill during these decades? Keep figure 4-1 in mind as we examine the state of humanity in the chapters to come.
Figure 4-1: Tone of the news, 1945–2010
Source: Leetaru 2011. Plotted by month, beginning in January.
* * *
What is progress? You might think that the question is so subjective and culturally relative as to be forever unanswerable. In fact, it’s one of the easier questions to answer.
Most people agree that life is better than death. Health is better than sickness. Sustenance is better than hunger. Abundance is better than poverty. Peace is better than war. Safety is better than danger. Freedom is better than tyranny. Equal rights are better than bigotry and discrimination. Literacy is better than illiteracy. Knowledge is better than ignorance. Intelligence is better than dull-wittedness. Happiness is better than misery. Opportunities to enjoy family, friends, culture, and nature are better than drudgery and monotony.
All these things can be measured. If they have increased over time, that is progress.
Granted, not everyone would agree on the exact list. The values are avowedly humanistic, and leave out religious, romantic, and aristocratic virtues like salvation, grace, sacredness, heroism, honor, glory, and authenticity. But most would agree that it’s a necessary start. It’s easy to extol transcendent values in the abstract, but most people prioritize life, health, safety, literacy, sustenance, and stimulation for the obvious reason that these goods are a prerequisite to everything else. If you’re reading this, you are not dead, starving, destitute, moribund, terrified, enslaved, or illiterate, which means that you’re in no position to turn your nose up at these values—or to deny that other people should share your good fortune.
As it happens, the world does agree on these values. In the year 2000, all 189 members of the United Nations, together with two dozen international organizations, agreed on eight Millennium Development Goals for the year 2015 that blend right into this list.31
And here is a shocker: The world has made spectacular progress in every single measure of human well-being. Here is a second shocker: Almost no one knows about it.
Information about human progress, though absent from major news outlets and intellectual forums, is easy enough to find. The data are not entombed in dry reports but are displayed in gorgeous Web sites, particularly Max Roser’s Our World in Data, Marian Tupy’s HumanProgress, and Hans Rosling’s Gapminder. (Rosling learned that not even swallowing a sword during a 2007 TED talk was enough to get the world’s attention.) The case has been made in beautifully written books, some by Nobel laureates, which flaunt the news in their titles—Progress, The Progress Paradox, Infinite Progress, The Infinite Resource, The Rational Optimist, The Case for Rational Optimism, Utopia for Realists, Mass Flourishing, Abundance, The Improving State of the World, Getting Better, The End of Doom, The Moral Arc, The Big Ratchet, The Great Escape, The Great Surge, The Great Convergence.32 (None was recognized with a major prize, but over the period in which they appeared, Pulitzers in nonfiction were given to four books on genocide, three on terrorism, two on cancer, two on racism, and one on extinction.) And for those whose reading habits tend toward listicles, recent years have offered “Five Amazing Pieces of Good News Nobody Is Reporting,” “Five Reasons Why 2013 Was the Best Year in Human History,” “Seven Reasons the World Looks Worse Than It Really Is,” “29 Charts and Maps That Show the World Is Getting Much, Much Better,” “40 Ways the World Is Getting Better,” and my favorite, “50 Reasons We’re Living Through the Greatest Period in World History.” Let’s look at some of those reasons.
CHAPTER 5
LIFE
The struggle to stay alive is the primal urge of animate beings, and humans deploy their ingenuity and conscious resolve to stave off death as long as possible. “Choose life, so that you and your children may live,” commanded the God of the Hebrew Bible; “Rage, rage against the dying of the light,” adjured Dylan Thomas. A long life is the ultimate blessing.
How long do you think an average person in the world can be expected to live today? Bear in mind that the global average is dragged down by the premature deaths from hunger and disease in the populous countries in the developing world, particularly by the deaths of infants, who mix a lot of zeroes into the average.
The answer for 2015 is 71.4 years.1 How close is that to your guess? In a recent survey Hans Rosling found that fewer than one in four Swedes guessed that it was that high, a finding consistent with the results of other multinational surveys of opinions on longevity, literacy, and poverty in what Rosling dubbed the Ignorance Project. The logo of the project is a chimpanzee, because, as Rosling explained, “If for each question I wrote the alternatives on bananas, and asked chimpanzees in the zoo to pick the right answers, they’d have done better than the respondents.” The respondents, including students and professors of global health, were not so much ignorant as fallaciously pessimistic.2
Figure 5-1, a plot from Max Roser of life expectancy over the centuries, displays a general pattern in world history. At the time when the lines begin, in the mid-18th century, life expectancy in Europe and the Americas was around 35, where it had been parked for the 225 previous years for which we have data.3 Life expectancy for the world as a whole was 29. These numbers are in the range of expected life spans for most of human history. The life expectancy of hunter-gatherers is around 32.5, and it probably decreased among the peoples who first took up farming because of their starchy diet and the diseases they caught from their livestock and each other. It returned to the low 30s by the Bronze Age, where it stayed put for thousands of years, with small fluctuations across centuries and regions.4 This period in human history may be called the Malthusian Era, when any advance in agriculture or health was quickly canceled by the resulting bulge in population, though “era” is an odd term for 99.9 percent of our species’ existence.
Figure 5-1: Life expectancy, 1771–2015
Sources: Our World in Data, Roser 2016n, based on data from Riley 2005 for the years before 2000 and from the World Health Organization and the World Bank for the subsequent years. Updated with data provided by Max Roser.
But starting in the 19th century, the world embarked on the Great Escape, the economist Angus Deaton’s term for humanity’s release from its patrimony of poverty, disease, and early death. Life expectancy began to rise, picked up speed in the 20th century, and shows no signs of slowing down. As the economic historian Johan Norberg points out, we tend to think that “we approach death by one year for every year we age, but during the twentieth century, the average person approached death by just seven months for every year they aged.” Thrillingly, the gift of longevity is spreading to all of humankind, including the world’s poorest countries, and at a much faster pace than it did in the rich ones. “Life expectancy in Kenya increased by almost ten years between 2003 and 2013,” Norberg writes. “After having lived, loved and struggled for a whole decade, the average person in Kenya had not lost a single year of their remaining lifetime. Everyone got ten years older, yet death had not come a step closer.”5
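Norberg’s figures rest on a simple piece of arithmetic: every calendar year ages you by twelve months, but rising life expectancy hands some of that time back. The sketch below works through it; the five-months-per-year gain is the figure his seven-month claim implies, not a number he states directly.

```python
def months_closer_to_death(years_lived, expectancy_gain_years):
    """Net months of remaining lifetime lost: 12 months of aging per
    year lived, minus the months of life expectancy gained meanwhile."""
    return years_lived * 12 - expectancy_gain_years * 12

# A gain of ~5 months of life expectancy per year lived yields
# Norberg's twentieth-century figure of 7 months lost per year.
print(months_closer_to_death(1, 5 / 12))  # → 7.0

# Kenya, 2003-2013: ten years of expectancy gained in ten years,
# so the average person ended the decade no closer to death.
print(months_closer_to_death(10, 10))     # → 0
```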
As a result, inequality in life expectancy, which opened up during the Great Escape when a few fortunate countries broke away from the pack, is shrinking as the rest catch up. In 1800, no country in the world had a life expectancy above 40. By 1950, it had grown to around 60 in Europe and the Americas, leaving Africa and Asia far behind. But since then Asia has shot up at twice the European rate, and Africa at one and a half times the rate. An African born today can expect to live as long as a person born in the Americas in 1950 or in Europe in the 1930s. The average would have been longer still were it not for the calamity of AIDS, which caused the terrible trough in the 1990s before antiretroviral drugs started to bring it under control.