Enlightenment Now
Reasoning thus has deep evolutionary roots. The citizen scientist Louis Liebenberg has studied the San hunter-gatherers of the Kalahari Desert (the “Bushmen”), one of the world’s most ancient cultures. They engage in the oldest form of the chase, persistence hunting, in which humans, with their unique ability to dump heat through sweat-slicked skin, pursue a furry mammal in the midday sun until it collapses of heat stroke. Since most mammals are swifter than humans and dart out of sight as soon as they are spotted, persistence hunters track them by their spoor, which means inferring the animal’s species, sex, age, and level of fatigue, and thus its likely direction of flight, from the hoofprints, bent stems, and displaced pebbles it leaves behind. The San do not just engage in inference—deducing, for example, that agile springboks tread deeply with pointed hooves to get a good grip, whereas heavy kudus tread flat-footed to support their weight. They also engage in reasoning—articulating the logic behind their inferences to persuade their companions or be persuaded in their turn. Liebenberg observed that Kalahari trackers don’t accept arguments from authority. A young tracker can challenge the majority opinion of his elders, and if his interpretation of the evidence is convincing, he can bring them around, increasing the group’s accuracy.8
And if you’re still tempted to excuse modern dogma and superstition by saying that it’s only human, consider Liebenberg’s account of scientific skepticism among the San:
Three trackers, !Nate, /Uase and Boroh//xao, of Lone Tree in the central Kalahari, told me that the Monotonous Lark (Mirafra passerina) only sings after it has rained, because “it is happy that it rained.” One tracker, Boroh//xao, told me that when the bird sings, it dries out the soil, making the roots good to eat. Afterwards, !Nate and /Uase told me that Boroh//xao was wrong—it is not the bird that dries out the soil, it is the sun that dries out the soil. The bird is only telling them that the soil will dry out in the coming months and that it is the time of the year when the roots are good to eat. . . .
!Namka, a tracker from Bere in the central Kalahari, Botswana, told me the myth of how the sun is like an eland, which crosses the sky and is then killed by people who live in the west. The red glow in the sky when the sun goes down is the blood of the eland. After they have eaten it, they throw the shoulder blade across the sky back to the east, where it falls into a pool and grows into a new sun. Sometimes, it is said, you can hear the swishing noise of the shoulder blade flying through the air. After telling me the story in great detail, he told me that he thinks that the “Old People” lied, because he has never seen . . . the shoulder blade fly through the sky or heard the swishing noise.9
Of course, none of this contradicts the discovery that humans are vulnerable to illusions and fallacies. Our brains are limited in their capacity to process information and evolved in a world without science, scholarship, and other forms of fact-checking. But reality is a mighty selection pressure, so a species that lives by ideas must have evolved with an ability to prefer correct ones. The challenge for us today is to design an informational environment in which that ability prevails over the ones that lead us into folly. The first step is to pinpoint why an otherwise intelligent species is so easily led into folly.
* * *
The 21st century, an age of unprecedented access to knowledge, has also seen maelstroms of irrationality, including the denial of evolution, vaccine safety, and anthropogenic climate change, and the promulgation of conspiracy theories, from 9/11 to the size of Donald Trump’s popular vote. Fans of rationality are desperate to understand the paradox, but in a bit of irrationality of their own, they seldom look at data that might explain it.
The standard explanation of the madness of crowds is ignorance: a mediocre education system has left the populace scientifically illiterate, at the mercy of their cognitive biases, and thus defenseless against airhead celebrities, cable-news gladiators, and other corruptions from popular culture. The standard solution is better schooling and more outreach to the public by scientists on television, social media, and popular Web sites. As an outreaching scientist I’ve always found this theory appealing, but I’ve come to realize it’s wrong, or at best a small part of the problem.
Consider these questions about evolution:
During the Industrial Revolution of the 19th century, the English countryside got covered in soot, and the Peppered Moth became, on average, darker in color. How did this happen?
A. In order to blend in with their surroundings, the moths had to become darker in color.
B. The moths with darker color were less likely to get eaten and were more likely to reproduce.
After a year the average test score at a private high school increased by thirty points. Which explanation for this change is most analogous to Darwin’s explanation for the adaptation of species?
A. The school no longer admitted children of wealthy alumni unless they met the same standards as everyone else.
B. Since the last test, each returning student had grown more knowledgeable.
The correct answers are B and A. The psychologist Andrew Shtulman gave high school and university students a battery of questions like this which probed for a deep understanding of the theory of natural selection, in particular the key idea that evolution consists of changes in the proportion of a population with adaptive traits rather than a transformation of the population so that its traits would be more adaptive. He found no correlation between performance on the test and a belief that natural selection explains the origin of humans. People can believe in evolution without understanding it, and vice versa.10 In the 1980s several biologists got burned when they accepted invitations to debate creationists who turned out to be not Bible-thumping yokels but well-briefed litigators who cited cutting-edge research to sow uncertainty as to whether the science was complete.
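Shtulman's key idea — that no individual transforms, but the composition of the population shifts — can be illustrated with a minimal simulation. The numbers below (initial counts, survival rates) are invented for illustration only:

```python
import random

# Toy model of the Peppered Moth case: no moth ever changes color.
# The population darkens only because dark moths on soot-covered trees
# survive predation more often and so leave more offspring.
random.seed(0)
population = ["dark"] * 100 + ["light"] * 900

# Hypothetical per-generation survival probabilities on sooty trees.
SURVIVAL = {"dark": 0.8, "light": 0.4}

for generation in range(10):
    survivors = [moth for moth in population if random.random() < SURVIVAL[moth]]
    # Heredity: each survivor leaves two offspring of its own color.
    population = survivors * 2

dark_fraction = population.count("dark") / len(population)
print(f"Dark moths after 10 generations: {dark_fraction:.0%}")
```

The dark moths go from a 10 percent minority to a large majority, even though every moth keeps the color it was born with — a change in proportions, not a transformation of individuals.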
Professing a belief in evolution is not a gift of scientific literacy, but an affirmation of loyalty to a liberal secular subculture as opposed to a conservative religious one. In 2010 the National Science Foundation dropped the following item from its test of scientific literacy: “Human beings, as we know them today, developed from earlier species of animals.” The reason for that change was not, as scientists howled, because the NSF had given in to creationist pressure to bowdlerize evolution from the scientific canon. It was that the correlation between performance on that item and on every other item on the test (such as “An electron is smaller than an atom” and “Antibiotics kill viruses”) was so low that it was taking up space in the test that could go to more diagnostic items. The item, in other words, was effectively a test of religiosity rather than scientific literacy.11 When the item was prefaced with “According to the theory of evolution,” so that scientific understanding was divorced from cultural allegiance, religious and nonreligious test-takers responded the same.12
Or consider these questions:
Climate scientists believe that if the North Pole icecap melted as a result of human-caused global warming, global sea levels would rise. True or False?
What gas do most scientists believe causes temperatures in the atmosphere to rise? Is it carbon dioxide, hydrogen, helium, or radon?
Climate scientists believe that human-caused global warming will increase the risk of skin cancer in human beings. True or False?
The answer to the first question is “false”; if it were true, your glass of Coke would overflow as the ice cubes melted. It’s icecaps on land, such as Greenland and Antarctica, that raise sea levels when they melt. Believers in human-made climate change scored no better on tests of climate science, or of science literacy in general, than deniers. Many believers think, for example, that global warming is caused by a hole in the ozone layer and that it can be mitigated by cleaning up toxic waste dumps.13 What predicts the denial of human-made climate change is not scientific illiteracy but political ideology. In 2015, 10 percent of conservative Republicans agreed that the Earth is getting warmer because of human activity (57 percent denied that the Earth is getting warmer at all), compared with 36 percent of moderate Republicans, 53 percent of Independents, 63 percent of moderate Democrats, and 78 percent of liberal Democrats.14
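The reasoning behind that “false” is Archimedes’ principle: floating ice already displaces a mass of water equal to its own, so when it melts, the meltwater exactly fills the volume the ice had been displacing. A back-of-envelope check, using standard textbook densities:

```python
ICE_DENSITY = 917.0     # kg per cubic meter, freshwater ice (textbook value)
WATER_DENSITY = 1000.0  # kg per cubic meter, liquid water

ice_mass = 1.0  # kg of floating ice (arbitrary)

ice_volume = ice_mass / ICE_DENSITY          # total volume of the floating ice
submerged_volume = ice_mass / WATER_DENSITY  # Archimedes: it displaces its own mass
meltwater_volume = ice_mass / WATER_DENSITY  # the same mass, now as liquid water

# The meltwater exactly replaces the water the ice was displacing,
# so the level does not change.
print(submerged_volume == meltwater_volume)  # True
print(f"Fraction of the ice above water: {1 - submerged_volume / ice_volume:.0%}")
```

The roughly 8 percent of an ice cube (or the North Pole icecap) that sits above the waterline is exactly the extra volume ice takes up compared with the water it will become.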
In a revolutionary analysis of reason in the public sphere, the legal scholar Dan Kahan has argued that certain beliefs become symbols of cultural allegiance. People affirm or deny these beliefs to express not what they know but who they are.15 We all identify with particular tribes or subcultures, each of which embraces a creed on what makes for a good life and how society should run its affairs. These creeds tend to vary along two dimensions. One contrasts a right-wing comfort with natural hierarchy with a left-wing preference for forced egalitarianism (measured by agreement with statements like “We need to dramatically reduce inequalities between the rich and the poor, whites and people of color, and men and women”). The other is a libertarian affinity to individualism versus a communitarian or authoritarian affinity to solidarity (measured by agreement with statements like “Government should put limits on the choices individuals can make so they don’t get in the way of what’s good for society”). A given belief, depending on how it is framed and who endorses it, can become a touchstone, password, motto, shibboleth, sacred value, or oath of allegiance to one of these tribes. As Kahan and his collaborators explain:
The principal reason people disagree about climate change science is not that it has been communicated to them in forms they cannot understand. Rather, it is that positions on climate change convey values—communal concern versus individual self-reliance; prudent self-abnegation versus the heroic pursuit of reward; humility versus ingenuity; harmony with nature versus mastery over it—that divide them along cultural lines.16
The values that divide people are also defined by which demons are blamed for society’s misfortunes: greedy corporations, out-of-touch elites, meddling bureaucrats, lying politicians, ignorant rednecks, or, all too often, ethnic minorities.
Kahan notes that people’s tendency to treat their beliefs as oaths of allegiance rather than disinterested appraisals is, in one sense, rational. With the exception of a tiny number of movers, shakers, and deciders, a person’s opinions on climate change or evolution are astronomically unlikely to make a difference to the world at large. But they make an enormous difference to the respect the person commands in his or her social circle. To express the wrong opinion on a politicized issue can make one an oddball at best—someone who “doesn’t get it”—and a traitor at worst. The pressure to conform becomes all the greater as people live and work with others who are like them and as academic, business, or religious cliques brand themselves with left-wing or right-wing causes. For pundits and politicians with a reputation for championing their faction, coming out on the wrong side of an issue would be career suicide.
Given these payoffs, endorsing a belief that hasn’t passed muster with science and fact-checking isn’t so irrational after all—at least, not by the criterion of the immediate effects on the believer. The effects on the society and planet are another matter. The atmosphere doesn’t care what people think about it, and if it in fact warms by 4° Celsius, billions of people will suffer, no matter how many of them had been esteemed in their peer groups for holding the locally fashionable opinion on climate change along the way. Kahan concludes that we are all actors in a Tragedy of the Belief Commons: what’s rational for every individual to believe (based on esteem) can be irrational for the society as a whole to act upon (based on reality).17
The perverse incentives behind “expressive rationality” or “identity-protective cognition” help explain the paradox of 21st-century irrationality. During the 2016 presidential campaign, many political observers were incredulous at opinions expressed by Trump supporters (and in many cases by Trump himself), such as that Hillary Clinton had multiple sclerosis and was concealing it with a body double, or that Barack Obama must have had a role in 9/11 because he was never in the Oval Office around that time (Obama, of course, was not the president in 2001). As Amanda Marcotte put it, “These folks clearly are competent enough to dress themselves, read the address of the rally and show up on time, and somehow they continue to believe stuff that’s so crazy and so false that it’s impossible to believe anyone that isn’t barking mad could believe it. What’s going on?”18 What’s going on is that these people are sharing blue lies. A white lie is told for the benefit of the hearer; a blue lie is told for the benefit of an in-group (originally, fellow police officers).19 While some of the conspiracy theorists may be genuinely misinformed, most express these beliefs for the purpose of performance rather than truth: they are trying to antagonize liberals and display solidarity with their blood brothers. The anthropologist John Tooby adds that preposterous beliefs are more effective signals of coalitional loyalty than reasonable ones.20 Anyone can say that rocks fall down rather than up, but only a person who is truly committed to the brethren has a reason to say that God is three persons but also one person, or that the Democratic Party ran a child sex ring out of a Washington pizzeria.
* * *
The conspiracy theories of fervid hordes at a political rally represent an extreme case of self-expression trumping truth, but the Tragedy of the Belief Commons runs even deeper. Another paradox of rationality is that expertise, brainpower, and conscious reasoning do not, by themselves, guarantee that thinkers will approach the truth. On the contrary, they can be weapons for ever-more-ingenious rationalization. As Benjamin Franklin observed, “So convenient a thing is it to be a rational creature, since it enables us to find or make a reason for everything one has a mind to.”
Psychologists have long known that the human brain is infected with motivated reasoning (directing an argument toward a favored conclusion, rather than following it where it leads), biased evaluation (finding fault with evidence that disconfirms a favored position and giving a pass to evidence that supports it), and a My-Side bias (self-explanatory).21 In a classic experiment from 1954, the psychologists Al Hastorf and Hadley Cantril quizzed Dartmouth and Princeton students about a film of a recent bone-crushing, penalty-filled football game between the two schools, and found that each set of students saw more infractions by the other team.22
We know today that political partisanship is like sports fandom: testosterone levels rise or fall on election night just as they do on Super Bowl Sunday.23 And so it should not be surprising that political partisans—which include most of us—always see more infractions by the other team. In another classic study, the psychologists Charles Lord, Lee Ross, and Mark Lepper presented proponents and opponents of the death penalty with a pair of studies, one suggesting that capital punishment deterred homicide (murder rates went down the year after states adopted it), the other that it failed to do so (murder rates were higher in states that had capital punishment than in neighboring states that didn’t). The studies were fake but realistic, and the experimenters flipped the outcomes for half the participants just in case any of them found comparisons across time more convincing than comparisons across space or vice versa. The experimenters found that each group was momentarily swayed by the result they had just learned, but as soon as they had had a chance to read the details, they picked nits in whichever study was uncongenial to their starting position, saying things like “The evidence is meaningless without data about how the overall crime rate went up in those years,” or “There might be different circumstances between the two states even though they shared a border.” Thanks to this selective prosecution, the participants were more polarized after they had all been exposed to the same evidence than before: the antis were more anti, the pros more pro.24
Engagement with politics is like sports fandom in another way: people seek and consume news to enhance the fan experience, not to make their opinions more accurate.25 That explains another of Kahan’s findings: the better informed a person is about climate change, the more polarized his or her opinion.26 Indeed, people needn’t even have a prior opinion to be polarized by the facts. When Kahan exposed people to a neutral, balanced presentation of the risks of nanotechnology (hardly a hot button on the cable news networks), they promptly split into factions that aligned with their views on nuclear power and genetically modified foods.27
If these studies aren’t sobering enough, consider this one, described by one magazine as “The Most Depressing Discovery About the Brain, Ever.”28 Kahan recruited a thousand Americans from all walks of life, assessed their politics and numeracy with standard questionnaires, and asked them to look at some data to evaluate the effectiveness of a new treatment for an ailment. The respondents were told that they had to pay close attention to the numbers, because the treatment was not expected to work a hundred percent of the time and might even make things worse, while sometimes the ailment got better on its own, without any treatment. The numbers had been jiggered so that one answer popped out (the treatment worked, because a larger number of treated people showed an improvement) but the other answer was correct (the treatment didn’t work, because a smaller proportion of the treated people showed an improvement). The knee-jerk answer could be overridden by a smidgen of mental math, namely eyeballing the ratios. In one version, the respondents were told that the ailment was a rash and the treatment was a skin cream. Here are the numbers they were shown:
                 Improved    Got Worse
Treatment           223          75
No Treatment        107          21
The data implied that the skin cream did more harm than good: the people who used it improved at a ratio of around three to one, while those not using it improved at a ratio of around five to one. (With half the respondents, the rows were flipped, implying that the skin cream did work.) The more innumerate respondents were seduced by the larger absolute number of treated people who got better (223 versus 107) and picked the wrong answer. The highly numerate respondents zoomed in on the difference between the two ratios (3:1 versus 5:1) and picked the right one. The numerate respondents, of course, were not biased for or against skin cream: whichever way the data went, they spotted the difference. And contrary to liberal Democrats’ and conservative Republicans’ worst suspicions about each other’s intelligence, neither faction did substantially better than the other.
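The “smidgen of mental math” that separated the numerate respondents from the innumerate ones amounts to comparing proportions rather than raw counts, as in this sketch:

```python
# Outcome counts from the skin-cream scenario shown above.
treated = {"improved": 223, "got_worse": 75}
untreated = {"improved": 107, "got_worse": 21}

def improvement_rate(outcomes):
    """Fraction of people in a condition who improved."""
    total = outcomes["improved"] + outcomes["got_worse"]
    return outcomes["improved"] / total

rate_treated = improvement_rate(treated)      # 223/298, about 3 improved per 1 worse
rate_untreated = improvement_rate(untreated)  # 107/128, about 5 improved per 1 worse

# The knee-jerk comparison (223 > 107) looks at absolute numbers;
# the correct comparison of proportions shows the cream did harm.
print(rate_treated < rate_untreated)  # True
```

The trap is that more people were treated than untreated, so the treated group produces the bigger absolute number of improvers even though each treated person was *less* likely to improve.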