Eugenics is another movement that has been used as an ideological blunderbuss. Francis Galton, a Victorian polymath, first suggested that the genetic stock of humankind could be improved by offering incentives for talented people to marry each other and have more children (positive eugenics), though when the idea caught on it was extended to discouraging reproduction among the “unfit” (negative eugenics). Many countries forcibly sterilized delinquents, the mentally retarded, the mentally ill, and other people who fell into a wide net of ailments and stigmas. Nazi Germany modeled its forced sterilization laws after ones in Scandinavia and the United States, and its mass murder of Jews, Roma, and homosexuals is often considered a logical extension of negative eugenics. (In reality the Nazis invoked public health far more than genetics or evolution: Jews were likened to vermin, pathogens, tumors, gangrenous organs, and poisoned blood.)38
The eugenics movement was permanently discredited by its association with Nazism. But the term survived as a way to taint a number of scientific endeavors, such as applications of medical genetics that allow parents to bear children without fatal degenerative diseases, and the entire field of behavioral genetics, which analyzes the genetic and environmental causes of individual differences.39 And in defiance of the historical record, eugenics is often portrayed as a movement of right-wing scientists. In fact it was championed by progressives, liberals, and socialists, including Theodore Roosevelt, H. G. Wells, Emma Goldman, George Bernard Shaw, Harold Laski, John Maynard Keynes, Sidney and Beatrice Webb, Woodrow Wilson, and Margaret Sanger.40 Eugenics, after all, valorized reform over the status quo, social responsibility over selfishness, and central planning over laissez-faire. The most decisive repudiation of eugenics invokes classical liberal and libertarian principles: government is not an omnipotent ruler over human existence but an institution with circumscribed powers, and perfecting the genetic makeup of the species is not among them.
I’ve mentioned the limited role of science in these movements not to absolve the scientists (many of whom were indeed active or complicit) but because the movements deserve a deeper and more contextualized understanding than their current role as anti-science propaganda. Misunderstandings of Darwin gave these movements a boost, but they sprang from the religious, artistic, intellectual, and political beliefs of their eras: Romanticism, cultural pessimism, progress as dialectical struggle or mystical unfolding, and authoritarian high modernism. If we think these ideas are not just unfashionable but mistaken, it is because of the better historical and scientific understanding we enjoy today.
* * *
Recriminations over the nature of science are by no means a relic of the “science wars” of the 1980s and 1990s, but continue to shape the role of science in universities. When Harvard reformed its general education requirement in 2006–7, the preliminary task force report introduced the teaching of science without any mention of its place in human knowledge: “Science and technology directly affect our students in many ways, both positive and negative: they have led to life-saving medicines, the internet, more efficient energy storage, and digital entertainment; they also have shepherded in nuclear weapons, biological warfare agents, electronic eavesdropping, and damage to the environment.” Well, yes, and I suppose one could say that architecture has produced both museums and gas chambers, that classical music both stimulates economic activity and inspired the Nazis, and so on. But this strange equivocation between the utilitarian and the nefarious was not applied to other disciplines, and the statement gave no indication that we might have good reasons to prefer understanding and know-how to ignorance and superstition.
At a recent conference, another colleague summed up what she thought was the mixed legacy of science: vaccines for smallpox on the one hand; the Tuskegee syphilis study on the other. In that affair, another bloody shirt in the standard narrative about the evils of science, public health researchers, beginning in 1932, tracked the progression of untreated latent syphilis in a sample of impoverished African Americans for four decades. The study was patently unethical by today’s standards, though it’s often misreported to pile up the indictment. The researchers, many of them African American or advocates of African American health and well-being, did not infect the participants, as many people believe (a misconception that has led to the widespread conspiracy theory that AIDS was invented in US government labs to control the black population). And when the study began, it may even have been defensible by the standards of the day: treatments for syphilis (mainly arsenic) were toxic and ineffective; when antibiotics became available later, their safety and efficacy in treating syphilis were unknown; and latent syphilis was known to often resolve itself without treatment.41 But the point is that the entire equation is morally obtuse, showing the power of Second Culture talking points to scramble a sense of proportionality. My colleague’s comparison assumed that the Tuskegee study was an unavoidable part of scientific practice as opposed to a universally deplored breach, and it equated a one-time failure to prevent harm to a few dozen people with the prevention of hundreds of millions of deaths per century in perpetuity.
Does the demonization of science in the liberal arts programs of higher education matter? It does, for a number of reasons. Though many talented students hurtle along pre-med or engineering tracks from the day they set foot on campus, many others are unsure of what they want to do with their lives and take their cues from their professors and advisors. What happens to those who are taught that science is just another narrative like religion and myth, that it lurches from revolution to revolution without making progress, and that it is a rationalization of racism, sexism, and genocide? I’ve seen the answer: some of them figure, “If that’s what science is, I might as well make money!” Four years later their brainpower is applied to thinking up algorithms that allow hedge funds to act on financial information a few milliseconds faster rather than to finding new treatments for Alzheimer’s disease or technologies for carbon capture and storage.
The stigmatization of science is also jeopardizing the progress of science itself. Today anyone who wants to do research on human beings, even an interview on political opinions or a questionnaire about irregular verbs, must prove to a committee that he or she is not Josef Mengele. Though research subjects obviously must be protected from exploitation and harm, the institutional review bureaucracy has swollen far beyond this mission. Critics have pointed out that it has become a menace to free speech, a weapon that fanatics can use to shut up people whose opinions they don’t like, and a red-tape dispenser which bogs down research while failing to protect, and sometimes harming, patients and research subjects.42 Jonathan Moss, a medical researcher who had developed a new class of drugs and was drafted into chairing the research review board at the University of Chicago, said in a convocation address, “I ask you to consider three medical miracles we take for granted: X-rays, cardiac catheterization, and general anesthesia. I contend all three would be stillborn if we tried to deliver them in 2005.”43 (The same observation has been made about insulin, burn treatments, and other lifesavers.) The social sciences face similar hurdles. Anyone who talks to a human being with the intent of gaining generalizable knowledge must obtain prior permission from these committees, almost certainly in violation of the First Amendment. Anthropologists are forbidden to speak with illiterate peasants who cannot sign a consent form, or interview would-be suicide bombers on the off chance that they might blurt out information that puts them in jeopardy.44
The hobbling of research is not just a symptom of bureaucratic mission creep. It is actually rationalized by many academics in a field called bioethics. These theoreticians think up reasons why informed and consenting adults should be forbidden to take part in treatments that help them and others while harming no one, using nebulous rubrics like “dignity,” “sacredness,” and “social justice.” They try to sow panic about advances in biomedical research using far-fetched analogies with nuclear weapons and Nazi atrocities, science-fiction dystopias like Brave New World and Gattaca, and freak-show scenarios like armies of cloned Hitlers, people selling their eyeballs on eBay, or warehouses of zombies to supply people with spare organs. The moral philosopher Julian Savulescu has exposed the low standards of reasoning behind these arguments and has pointed out why “bioethical” obstructionism can be unethical: “To delay by 1 year the development of a treatment that cures a lethal disease that kills 100,000 people per year is to be responsible for the deaths of those 100,000 people, even if you never see them.”45
* * *
Ultimately the greatest payoff of instilling an appreciation of science is for everyone to think more scientifically. We saw in the preceding chapter that humans are vulnerable to cognitive biases and fallacies. Though scientific literacy itself is not a cure for fallacious reasoning when it comes to politicized identity badges, most issues don’t start out that way, and everyone would be better off if they could think about them more scientifically. Movements that aim to spread scientific sophistication such as data journalism, Bayesian forecasting, evidence-based medicine and policy, real-time violence monitoring, and effective altruism have a vast potential to enhance human welfare. But an appreciation of their value has been slow to penetrate the culture.46
I asked my doctor whether the nutritional supplement he had recommended for my knee pain would really be effective. He replied, “Some of my patients say it works for them.” A business-school colleague shared this assessment of the corporate world: “I have observed many smart people who have little idea of how to logically think through a problem, who infer causation from a correlation, and who use anecdotes as evidence far beyond the predictability warranted.” Another colleague who quantifies war, peace, and human security describes the United Nations as an “evidence-free zone”:
The higher reaches of the UN are not unlike anti-science humanities programs. Most people at the top are lawyers and liberal arts graduates. The only parts of the Secretariat that have anything resembling a research culture have little prestige or influence. Few of the top officials in the UN understood qualifying statements as basic as “on average and other things being equal.” So if we were talking about risk probabilities for conflict onsets you could be sure that Sir Archibald Prendergast III or some other luminary would offer a dismissive, “It’s not like that in Burkina Faso, y’know.”
Resisters of scientific thinking often object that some things just can’t be quantified. Yet unless they are willing to speak only of issues that are black or white and to forswear using the words more, less, better, and worse (and for that matter the suffix -er), they are making claims that are inherently quantitative. If they veto the possibility of putting numbers to them, they are saying, “Trust my intuition.” But if there’s one thing we know about cognition, it’s that people (including experts) are arrogantly overconfident about their intuition. In 1954 Paul Meehl stunned his fellow psychologists by showing that simple actuarial formulas outperform expert judgment in predicting psychiatric classifications, suicide attempts, school and job performance, lies, crime, medical diagnoses, and pretty much any other outcome in which accuracy can be judged at all. Meehl’s work inspired Tversky and Kahneman’s discoveries on cognitive biases and Tetlock’s forecasting tournaments, and his conclusion about the superiority of statistical to intuitive judgment is now recognized as one of the most robust findings in the history of psychology.47
Like all good things, data are not a panacea, a silver bullet, a magic bullet, or a one-size-fits-all solution. All the money in the world could not pay for randomized controlled trials to settle every question that occurs to us. Human beings will always be in the loop to decide which data to gather and how to analyze and interpret them. The first attempts to quantify a concept are always crude, and even the best ones allow probabilistic rather than perfect understanding. Nonetheless, quantitative social scientists have laid out criteria for evaluating and improving measurements, and the critical comparison is not whether a measure is perfect but whether it is better than the judgment of an expert, critic, interviewer, clinician, judge, or maven. That turns out to be a low bar.
Because the cultures of politics and journalism are largely innocent of the scientific mindset, questions with massive consequences for life and death are answered by methods that we know lead to error, such as anecdotes, headlines, rhetoric, and what engineers call HiPPO (highest-paid person’s opinion). We have already seen some dangerous misconceptions that arise from this statistical obtuseness. People think that crime and war are spinning out of control, though homicides and battle deaths are going down, not up. They think that Islamist terrorism is a major risk to life and limb, whereas the danger is smaller than that from wasps and bees. They think that ISIS threatens the existence or survival of the United States, whereas terrorist movements rarely achieve any of their strategic aims.
The dataphobic mindset (“It’s not like that in Burkina Faso”) can lead to real tragedy. Many political commentators can recall a failure of peacekeeping forces (such as in Bosnia in 1995) and conclude that they are a waste of money and manpower. But when a peacekeeping force is successful, nothing photogenic happens, and it fails to make the news. In her book Does Peacekeeping Work? the political scientist Virginia Page Fortna addressed the question in her title with the methods of science rather than headlines, and, in defiance of Betteridge’s Law, found that the answer is “a clear and resounding yes.” Other studies have come to the same conclusion.48 Knowing the results of these analyses could make the difference between an international organization helping to bring peace to a country and letting it fester in civil war.
Do multiethnic regions harbor “ancient hatreds” that can only be tamed by partitioning them into ethnic enclaves and cleansing the minorities from each one? Whenever ethnic neighbors go for each other’s throats we read about it, but what about the neighborhoods that never make the news because they live in boring peace? What proportion of pairs of ethnic neighbors coexist without violence? The answer is, most of them: 95 percent of the neighbors in the former Soviet Union, 99 percent of those in Africa.49
Do campaigns of nonviolent resistance work? Many people believe that Gandhi and Martin Luther King just got lucky: their movements tugged at the heartstrings of enlightened democracies at opportune moments, but everywhere else, oppressed people need violence to get out from under a dictator’s boot. The political scientists Erica Chenoweth and Maria Stephan assembled a dataset of political resistance movements across the world between 1900 and 2006 and discovered that three-quarters of the nonviolent resistance movements succeeded, compared with only a third of the violent ones.50 Gandhi and King were right, but without data, you would never know it.
Though the urge to join a violent insurgent or terrorist group may owe more to male bonding than to just-war theory, most of the combatants probably believe that if they want to bring about a better world, they have no choice but to kill people. What would happen if everyone knew that violent strategies were not just immoral but ineffectual? It’s not that I think we should airdrop crates of Chenoweth and Stephan’s book into conflict zones. But leaders of radical groups are often highly educated (they distill their frenzy from academic scribblers of a few years back), and even the cannon fodder often attend some college and absorb the conventional wisdom about the need for revolutionary violence.51 What would happen over the long run if a standard college curriculum devoted less attention to the writings of Karl Marx and Frantz Fanon and more to quantitative analyses of political violence?
* * *
One of the greatest potential contributions of modern science may be a deeper integration with its academic partner, the humanities. By all accounts, the humanities are in trouble. University programs are downsizing; the next generation of scholars is un- or underemployed; morale is sinking; students are staying away in droves.52
No thinking person should be indifferent to our society’s disinvestment in the humanities.53 A society without historical scholarship is like a person without memory: deluded, confused, easily exploited. Philosophy grows out of the recognition that clarity and logic don’t come easily to us and that we’re better off when our thinking is refined and deepened. The arts are one of the things that make life worth living, enriching human experience with beauty and insight. Criticism is itself an art that multiplies the appreciation and enjoyment of great works. Knowledge in these domains is hard won, and needs constant enriching and updating as the times change.