Among the grassroots, attitudes all over the world will almost certainly ensure that women gain greater economic and political representation in the coming years. A 2010 survey of twenty-two countries by the Pew Research Center Global Attitudes Project found that in most of them, including the United States, China, India, Japan, South Korea, Turkey, Lebanon, and countries in Europe and Latin America, at least 90 percent of respondents of both sexes believe that women should have equal rights. Even in Egypt, Jordan, Indonesia, Pakistan, and Kenya, more than 60 percent favor equal rights; only in Nigeria does the proportion fall just short of half.98 Support for women being allowed to work outside the home is even higher. And recall the global Gallup survey showing that even in Islamic countries a majority of women believe that women should be able to vote as they please, work at any job, and serve in government, and that in most of those countries a majority of the men agreed.99 As this pent-up demand is released, the interests of women are bound to be given greater consideration in their countries’ policies and norms. The argument that women should not be assaulted by the men in their lives is irrefutable, and as Victor Hugo noted, “There is nothing more powerful than an idea whose time has come.”
CHILDREN’S RIGHTS AND THE DECLINE OF INFANTICIDE, SPANKING, CHILD ABUSE, AND BULLYING
What do Moses, Ishmael, Romulus and Remus, Oedipus, Cyrus the Great, Sargon, Gilgamesh, and Hou Chi (a founder of the Chou Dynasty) have in common? They were all exposed as infants—abandoned by their parents and left to the elements.100 The image of a helpless baby dying alone of cold, hunger, and predation is a potent tug on the heartstrings, so it is not surprising that a rise from infant exposure to dynastic greatness found its way into the mythologies of Jewish, Muslim, Roman, Greek, Persian, Akkadian, Sumerian, and Chinese civilizations. But the ubiquity of the exposure archetype is not just a lesson in what makes for a good story arc. It is also a lesson on how common infanticide was in human history. From time immemorial, parents have abandoned, smothered, strangled, beaten, drowned, or poisoned many of their newborns.101
A survey of cultures by the anthropologist Laila Williamson reveals that infanticide has been practiced on every continent and by every kind of society, from nonstate bands and villages (77 percent of which have an accepted custom of infanticide) to advanced civilizations.102 Until recently, between 10 and 15 percent of all babies were killed shortly after they were born, and in some societies the rate has been as high as 50 percent.103 In the words of the historian Lloyd deMause, “All families once practiced infanticide. All states trace their origin to child sacrifice. All religions began with the mutilation and murder of children.”104
Though infanticide is the most extreme form of maltreatment of children, our cultural heritage tells of many others, including the sacrifice of children to gods; the sale of children into slavery, marriage, and religious servitude; the exploitation of children to clean chimneys and crawl through tunnels in coal mines; and the subjection of children to forms of corporal punishment that verge on or cross over into torture.105 We have come a long way to arrive at an age in which one-pound preemies are rescued with heroic surgery, children are not expected to be economically productive until their fourth decade, and violence against children has been defined down to dodgeball.
How can we make sense of something that runs as contrary to the continuation of life as killing a newborn? In the concluding chapter of Hardness of Heart/Hardness of Life, his magisterial survey of infanticide around the world, the physician Larry Milner makes a confession:

I began this book with one purpose in mind—to understand, as stated in the Introduction: “How someone can take their own child, and strangle it to death?” When I first raised the question many years ago, I thought the issue to be suggestive of some unique pathologic alteration of Nature’s way. It did not seem rational that evolution would maintain an inherited tendency to kill one’s offspring when survival was already in such a delicate balance. Darwinian natural selection of genetic material meant that only the survival of the fittest was guaranteed; a tendency toward infanticide must certainly be a sign of unfit behavior that would not pass this reasonable standard. But the answer which has emerged from my research indicates that one of the most “natural” things a human being can do is voluntarily kill its own offspring when faced with a variety of stressful situations.106
The solution to Milner’s puzzlement lies in the subfield of evolutionary biology called life history theory.107 The intuition that a mother should treat every offspring as infinitely precious, far from being an implication of the theory of natural selection, is incompatible with it. Selection acts to maximize an organism’s expected lifetime reproductive output, and that requires that it negotiate the tradeoff between investing in a new offspring and conserving its resources for current and future offspring. Mammals are extreme among animals in the amount of time, energy, and food they invest in their young, and humans are extreme among mammals. Pregnancy and birth are only the first chapter in a mother’s investment career, and a mammalian mother faces an expenditure of more calories in suckling the offspring to maturity than she expended in bearing it.108 Nature generally abhors the sunk-cost fallacy, and so we expect mothers to assess the offspring and the circumstances to decide whether to commit themselves to the additional investment or to conserve their energy for its born or unborn siblings.109 If a newborn is sickly, or if the situation is unpromising for its survival, they do not throw good money after bad but cut their losses and favor the healthiest in the litter or wait until times get better and they can try again.
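The tradeoff can be put schematically (a minimal sketch in notation of my own choosing, not a formula from the life-history literature cited here). A mother maximizing her lifetime reproductive output should commit to a newborn only when

$p_s \cdot V_{\text{child}} > \Delta V_{\text{siblings}}$

where $p_s$ is the probability that this infant survives to reproduce, $V_{\text{child}}$ is its expected reproductive value if it does, and $\Delta V_{\text{siblings}}$ is the expected gain to her born and unborn children from redirecting her limited calories and care to them. A sickly infant, a famine, or an absent father drives $p_s$ down; a nursling already at the breast drives $\Delta V_{\text{siblings}}$ up; and a young mother with many fertile years ahead faces a larger $\Delta V_{\text{siblings}}$ than an older one, since she can more easily try again.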
To a biologist, human infanticide is an example of this triage.110 Until recently, women nursed their children for two to four years before returning to full fertility. Many children died, especially in the perilous first year. Most women saw no more than two or three of their children survive to adulthood, and many did not see any survive. To become a grandmother in the unforgiving environment of our evolutionary ancestors, a woman would have had to make hard choices. The triage theory predicts that a mother would let a newborn die when its prospects for survival to adulthood were poor. The forecast may be based on bad signs in the infant, such as being deformed or unresponsive, or bad signs for successful motherhood, such as being burdened with older children, beset by war or famine, or unable to count on support from relatives or the baby’s father. It should also depend on whether she is young enough to have opportunities to try again.
Martin Daly and Margo Wilson tested the triage theory by examining a sample of sixty unrelated societies from a database of ethnographies.111 Infanticide was documented in a majority of them, and in 112 cases the anthropologists recorded a reason. Eighty-seven percent of the reasons fit the triage theory: the infant was not sired by the woman’s husband, the infant was deformed or ill, or the infant had strikes against its chances of surviving to maturity, such as being a twin, having an older sibling close in age, having no father around, or being born into a family that had fallen on hard economic times.
The ubiquity and evolutionary intelligibility of infanticide suggest that for all its apparent inhumanity, it is usually not a form of wanton murder but falls into a special category of violence. Anthropologists who interview these women (or their relatives, since the event may be too painful for the woman to discuss) often recount that the mother saw the death as an unavoidable tragedy and grieved for the lost child. Napoleon Chagnon, for example, wrote of the wife of a Yanomamö headman, “Bahami was pregnant when I began my fieldwork, but she destroyed the infant when it was born—a boy in this case—explaining tearfully that she had no choice. The new baby would have competed with Ariwari, her youngest child, who was still nursing. Rather than expose Ariwari to the dangers and uncertainty of an early weaning, she chose to terminate the newborn instead.”112 Though the Yanomamö are the so-called fierce people, infanticide is not necessarily a manifestation of fierceness across the board. Some warring tribes, particularly in Africa, rarely kill their newborns, while some relatively peaceful ones kill them regularly.113 The title of Milner’s magnum opus comes from a quotation from a 19th-century founder of anthropology, Edward Tylor, who wrote, “Infanticide arises from hardness of life rather than hardness of heart.”114
The fateful tipping point between keeping and sacrificing a newborn is set both by internal emotions and by cultural norms. In a culture such as ours that reveres birth and takes every step to allow babies to thrive, we tend to think that joyful bonding between mother and newborn is close to reflexive. But in fact it requires overcoming considerable psychological obstacles. In the 1st century CE, Plutarch pointed out an uncomfortable truth:

There is nothing so imperfect, so helpless, so naked, so shapeless, so foul, as man observed at birth, to whom alone, one might almost say, Nature has given not even a clean passage to the light; but, defiled with blood and covered with filth, and resembling more one just slain than one just born, he is an object for none to touch or lift up or kiss or embrace except for someone who loves with a natural affection.115
The “natural affection” is far from automatic. Daly and Wilson, and later the anthropologist Edward Hagen, have proposed that postpartum depression and its milder version, the baby blues, are not a hormonal malfunction but the emotional implementation of the decision period for keeping a child.116 Mothers with postpartum depression often feel emotionally detached from their newborns and may harbor intrusive thoughts of harming them. Mild depression, psychologists have found, often gives people a more accurate appraisal of their life prospects than the rose-tinted view we normally enjoy. The typical rumination of a depressed new mother—how will I cope with this burden?—has been a legitimate question for mothers throughout history who faced the weighty choice between a definite tragedy now and the possibility of an even greater tragedy later. As the situation becomes manageable and the blues dissipate, many women report falling in love with their baby, coming to see it as a uniquely wonderful individual.
Hagen examined the psychiatric literature on postpartum depression to test five predictions of the theory that it is an evaluation period for investing in a newborn. As predicted, postpartum depression is more common in women who lack social support (they are single, separated, dissatisfied with their marriage, or distant from their parents), who have had a complicated delivery or an unhealthy infant, and who are unemployed or whose husbands are unemployed. He also found reports of postpartum depression in a number of non-Western populations, and they showed the same risk factors (though he could not find enough suitable studies of traditional kin-based societies). Finally, postpartum depression is only loosely tied to measured hormonal imbalances, suggesting that it is not a malfunction but a design feature.
Many cultural traditions work to distance people’s emotions from a newborn until its survival seems likely. People may be enjoined from touching, naming, or granting legal personhood to a baby until a danger period is over, and the transition is often marked by a joyful ceremony, as in our own customs of the christening and the bris.117 Some traditions mark a series of milestones; traditional Judaism, for example, grants full legal personhood to a baby only after it has survived thirty days.
If I have tried to make infanticide a bit more comprehensible, it is only to reduce the distance between the vast history in which it was accepted and our contemporary sensibilities in which it is abhorrent. But the chasm that separates them is wide. Even when we acknowledge the harsh evolutionary logic that applies to the hard lives of premodern peoples, many of their infanticides are, by our standards, hard to comprehend and impossible to forgive. Examples from Daly and Wilson’s list include the killing of a newborn conceived in adultery, and the killing of all a woman’s children from a previous marriage when she takes (or is abducted by) a new husband. And then there are the 14 percent of the infanticidal justifications on the list that, as Daly and Wilson point out, do not easily fall into categories that an evolutionary biologist would have predicted beforehand. They include child sacrifice, an act of spite by a grandfather against his son-in-law, filicides that are committed to eliminate claimants to a throne or to avoid the obligations of kinship customs, and most commonly, the killing of a newborn for no other reason than that she is a girl.
Female infanticide has been put on the world’s agenda today by census data revealing a massive shortage of women in the developing world. “A hundred million missing” is the commonly cited statistic for the daughter shortfall, a majority of them in China and India.118 Many Asian families have a morbid preference for sons. In some countries a pregnant woman can walk into an amniocentesis or ultrasound clinic, and if she learns she is carrying a girl, she can walk next door to an abortion clinic. The technological efficiency of daughter-proofing a pregnancy may make it seem as if the girl shortage is a problem of modernity, but female infanticide has been documented in China and India for more than two thousand years.119 In China, midwives kept a bucket of water at the bedside to drown the baby if it was a girl. In India there were many methods: “giving a pill of tobacco and bhang to swallow, drowning in milk, smearing the mother’s breast with opium or the juice of the poisonous Datura, or covering the child’s mouth with a plaster of cow-dung before it drew breath.” Then and now, even when daughters are suffered to live, they may not last long. Parents allocate most of the available food to their sons, and as a Chinese doctor explains, “if a boy gets sick, the parents may send him to the hospital at once, but if a girl gets sick, the parents may say to themselves, ‘Well, we’ll see how she is tomorrow.’ ”120
Female infanticide, also called gendercide and gynecide, is not unique to Asia.121 The Yanomamö are one of many foraging peoples that kill more newborn daughters than sons. In ancient Greece and Rome, babies were “discarded in rivers, dunghills, or cesspools, placed in jars to starve, or exposed to the elements and beasts in the wild.”122 Infanticide was also common in medieval and Renaissance Europe.123 In all these places, more girls perished than boys. Often families would kill every daughter born to them until they had a son; subsequent daughters were allowed to live.
Female infanticide is biologically mysterious. Every child has a mother and a father, so if people are concerned about posterity, be it for their genes or their dynasty, culling their own daughters is a form of madness. A basic principle of evolutionary biology is that a fifty-fifty sex ratio at sexual maturity is a stable equilibrium in a population, because if males ever predominated, daughters would be in demand and would have an advantage over sons in attracting partners and contributing children to the next generation. And so it would be for sons if females ever predominated. To the extent that parents can control the sex ratio of their surviving offspring, whether by nature or by nurture, posterity should punish them for favoring sons or daughters across the board.124
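The reasoning behind that fifty-fifty equilibrium, usually credited to R. A. Fisher, can be sketched in a few lines (the symbols are mine, chosen for illustration, not drawn from the sources cited here). Suppose a generation reaches maturity with $N_m$ males and $N_f$ females, and together they leave $T$ offspring. Every one of those offspring has exactly one father and one mother, so the sexes as classes account for the same total $T$, and the expected output of an individual son is $T/N_m$ while that of a daughter is $T/N_f$. If sons predominate, then

$N_m > N_f \;\Rightarrow\; \frac{T}{N_f} > \frac{T}{N_m}$

and any parent biased toward daughters out-reproduces its neighbors, pushing the ratio back toward $N_m = N_f$; the mirror-image logic punishes a surplus of daughters. That is why a lineage that culls one sex across the board should, other things being equal, be selected out.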
One naïve hypothesis comes out of the realization that it is the number of females in a population that determines how rapidly it will grow. Perhaps tribes or nations that have multiplied themselves to the Malthusian limit on food or land kill their daughters to achieve zero population growth.125 One problem for the ZPG theory, however, is that many infanticidal tribes and civilizations were not environmentally stressed. A more serious problem is that it has the fatal flaw of all naïve good-of-the-group theories, namely that the mechanism it proposes is self-undermining. Any family that cheated on the policy and kept its daughters alive would take over the population, stocking it with its grandchildren while the excess bachelor sons of its altruistic neighbors died without issue. The lineages that were inclined to kill their newborn daughters would have died out long ago, and the persistence of female infanticide in any society would be a mystery.
Can evolutionary psychology explain the gender bias? Critics of that approach say that it is merely an exercise in creativity, since one can always come up with an ingenious evolutionary explanation for any phenomenon. But that is an illusion, arising from the fact that so many ingenious evolutionary hypotheses have turned out to be confirmed by the data. Such success is far from guaranteed. One prominent hypothesis that, for all its ingenuity, turned out to be false was the application of the Trivers-Willard theory of sex ratios to female infanticide in humans.126
The biologist Robert Trivers and the mathematician Dan Willard reasoned that even though sons and daughters are expected to yield the same number of grandchildren on average, the maximum number that each sex can promise is different. A superfit son can outcompete other males and impregnate any number of women and thereby have any number of children, whereas a superfit daughter can have no more than the maximum she can bear and nurture in her reproductive career. On the other hand, a daughter is a safer bet—an unfit son will lose the competition with other men and end up childless, whereas an unfit daughter almost never lacks for a willing sex partner. It’s not that her fitness is irrelevant—a healthy and desirable daughter will still have more surviving children than an unhealthy and undesirable one—but the difference is not as extreme as it is for boom-or-bust sons. To the extent that parents can predict the fitness of their children (say, by monitoring their own health, nutrition, or territory) and strategically tilt the sex ratio, they should favor sons when they are in better shape than the competition, and favor daughters when they are in worse shape.
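Their logic can be compressed into a single inequality (again a schematic rendering in my own notation, not one taken from their paper). Let $f_s(c)$ and $f_d(c)$ be the expected numbers of grandchildren delivered by a son and by a daughter as a function of the parents’ condition $c$. Both curves rise with $c$, but the son’s rises more steeply, because male reproductive success is the boom-or-bust variable; the curves therefore cross at some threshold $c^*$, and the prediction is

$\text{favor sons} \iff f_s(c) > f_d(c) \iff c > c^*$

Note that the predicted bias is conditional, not absolute: averaged over the whole population the two strategies must pay off equally, since every grandchild still has exactly one father and one mother.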
The Trivers-Willard theory has been confirmed in many nonhuman species and even, a bit, in Homo sapiens. In traditional societies, richer and higher-status people tend to live longer and attract more and better mates, so the theory predicts that higher-status people should favor sons and lower-status people should favor daughters. In some kinds of favoritism (like bequests in wills), that is exactly what happens.127 But with a very important kind of favoritism—allowing a newborn to live—the theory doesn’t work so well. The evolutionary anthropologists Sarah Hrdy and Kristen Hawkes have each shown that the Trivers-Willard theory gets only half of the story right. In India, it’s true that the higher castes tend to kill their daughters. Unfortunately, it’s not true that the lower castes tend to kill their sons. In fact, it’s hard to find a society anywhere that kills its sons.128 The infanticidal cultures of the world are either equal-opportunity baby-killers or they prefer to kill the girls—and with them, the Trivers-Willard explanation for female infanticide in humans.