If the nongenetic component of personality is the outcome of neurodevelopmental roulette, it would present us with two surprises. One is that just as the “genetic” term in the behavioral geneticist’s equation is not necessarily genetic, the “environmental” term is not necessarily environmental. If the unexplained variance is a product of chance events in brain assembly, yet another chunk of our personalities would be “biologically determined” (though not genetic) and beyond the scope of the best-laid plans of parents and society.
The other surprise is that we may have to make room for a pre-scientific explanatory concept in our view of human nature—not free will, as many people have suggested to me, but fate. It is not free will because among the traits that may differ between identical twins reared together are ones that are stubbornly involuntary. No one chooses to become schizophrenic, homosexual, musically gifted, or, for that matter, anxious or self-confident or open to experience. But the old idea of fate—in the sense of uncontrollable fortune, not strict predestination—can be reconciled with modern biology once we remember the many openings for chance to operate in development. Harris, noting how recent and parochial is the belief that we can shape our children, quotes a woman living in a remote village of India in the 1950s. When asked what kind of man she hoped her child would grow into, she shrugged and replied, “It is in his fate, no matter what I want.”70
NOT EVERYONE IS SO accepting of fate, or of the other forces beyond a parent’s control, like genes and peers. “I hope to God this isn’t true,” one mother said to the Chicago Tribune. “The thought that all this love that I’m pouring into him counts for nothing is too terrible to contemplate.”71 As with other discoveries about human nature, people hope to God it isn’t true. But the truth doesn’t care about our hopes, and sometimes it can force us to revisit those hopes in a liberating way.
Yes, it is disappointing that there is no algorithm for growing a happy and successful child. But would we really want to specify the traits of our children in advance, and never be delighted by the unpredictable gifts and quirks that every child brings into the world? People are appalled by human cloning and its dubious promise that parents can design their children by genetic engineering. But how different is that from the fantasy that parents can design their children by how they bring them up? Realistic parents would be less anxious parents. They could enjoy their time with their children rather than constantly trying to stimulate them, socialize them, and improve their characters. They could read stories to their children for the pleasure of it, not because it’s good for their neurons.
Many critics accuse Harris of trying to absolve parents of responsibility for their children’s lives: if the kids turn out badly, parents can say it’s not their fault. But by the same token she is assigning adults responsibility for their own lives: if your life is not going well, stop moaning that it’s all your parents’ fault. She is rescuing mothers from fatuous theories that blame them for every misfortune that befalls their children, and from the censorious know-it-alls who make them feel like ogres if they slip out of the house to work or skip a reading of Goodnight Moon. And the theory assigns us all a collective responsibility for the health of the neighborhoods and culture in which peer groups are embedded.
Finally: “So you’re saying it doesn’t matter how I treat my children?” What a question! Yes, of course it matters. Harris reminds her readers of the reasons.
First, parents wield enormous power over their children, and their actions can make a big difference to their happiness. Childrearing is above all an ethical responsibility. It is not OK for parents to beat, humiliate, deprive, or neglect their children, because those are awful things for a big strong person to do to a small helpless one. As Harris writes, “We may not hold their tomorrows in our hands but we surely hold their todays, and we have the power to make their todays very miserable.”72
Second, a parent and a child have a human relationship. No one ever asks, “So you’re saying it doesn’t matter how I treat my husband or wife?” even though no one but a newlywed believes that one can change the personality of one’s spouse. Husbands and wives are nice to each other (or should be) not to pound the other’s personality into a desired shape but to build a deep and satisfying relationship. Imagine being told that one cannot revamp the personality of a husband or wife and replying, “The thought that all this love I’m pouring into him (or her) counts for nothing is too terrible to contemplate.” So it is with parents and children: one person’s behavior toward another has consequences for the quality of the relationship between them. Over the course of a lifetime the balance of power shifts, and children, complete with memories of how they were treated, have a growing say in their dealings with their parents. As Harris puts it, “If you don’t think the moral imperative is a good enough reason to be nice to your kid, try this one: Be nice to your kid when he’s young so that he will be nice to you when you’re old.”73 There are well-functioning adults who still shake with rage when recounting the cruelties their parents inflicted on them as children. There are others who moisten up in private moments when recalling a kindness or sacrifice made for their happiness, perhaps one that the mother or father has long forgotten. If for no other reason, parents should treat their children well to allow them to grow up with such memories.
I have found that when people hear these explanations they lower their eyes and say, somewhat embarrassedly, “Yes. I knew that.” The fact that people can forget these simple truths when intellectualizing about children shows how far modern doctrines have taken us. They make it easy to think of children as lumps of putty to be shaped instead of partners in a human relationship. Even the theory that children adapt to their peer group becomes less surprising when we think of them as human beings like ourselves. “Peer group” is a patronizing term we use in connection with children for what we call “friends and colleagues and associates” when we talk about ourselves. We groan when children obsess over wearing the right kind of cargo pants, but we would be just as mortified if a very large person forced us to wear pink overalls to a corporate board meeting or a polyester disco suit to an academic conference. “Being socialized by a peer group” is another way of saying “living successfully within a society,” which for a social organism means “living.” It is children, above all, who are alleged to be blank slates, and that can make us forget they are people.
Chapter 20
The Arts
THE ARTS ARE in trouble. I didn’t say it; they did: the critics, scholars, and (as we now say) content providers who make their living in the arts and humanities. According to the theater director and critic Robert Brustein:
The possibility of sustaining high culture in our time is becoming increasingly problematical. Serious book stores are losing their franchise; small publishing houses are closing shop; little magazines are going out of business; nonprofit theaters are surviving primarily by commercializing their repertory; symphony orchestras are diluting their programs; public television is increasing its dependence on reruns of British sitcoms; classical radio stations are dwindling; museums are resorting to blockbuster shows; dance is dying.1
In recent years the higher-brow magazines and presses have been filled with similar laments. Here is a sample of titles:
The Death of Literature2 • The Decline and Fall of Literature3 • The Decline of High Culture4 • Have the Humanities Disciplines Collapsed?5 • The Humanities—At Twilight?6 • Humanities in the Age of Money7 • The Humanities’ Plight8 • Literature: An Embattled Profession9 • Literature Lost10 • Music’s Dying Fall11 • The Rise and Fall of English12 • What’s Happened to the Humanities?13 • Who Killed Culture?14
If we are to believe the pessimists, the decline has been going on for some time. In 1948 T. S. Eliot wrote, “We can assert with some confidence that our own period is one of decline; that the standards of culture are lower than they were fifty years ago; and that the evidences of this decline are visible in every department of human activity.”15
Some of the vital signs of the arts and humanities are indeed poor. In 1997 the U.S. House of Representatives voted to kill the National Endowment for the Arts, and the Senate was able to save it only by cutting its budget nearly in half. Universities have disinvested in the humanities: since 1960, the proportion of faculty in liberal arts has fallen by half, salaries and working conditions have stagnated, and more and more teaching is done by graduate students and part-time faculty.16 New Ph.D.s are often unemployed or resigned to a life of one-year appointments. In many liberal arts colleges, humanities departments have been downsized, merged, or eliminated altogether.
One cause of the decline in academia is competition from the efflorescence of science and engineering. Another may be a surfeit of Ph.D.s pumped out by graduate programs that failed to practice academic birth control. But the problem is as much a reduction in the demand by students as an increase in the supply of professors. While the total number of bachelor’s degrees rose by almost 40 percent between 1970 and 1994, the number of degrees in English declined by 40 percent. It may get worse: only 9 percent of high school students today indicate an interest in majoring in the humanities.17 One university was so desperate to restore enrollment in its College of Arts and Sciences that it hired an advertising firm to devise a “Think for a Living” campaign. Here are some of the slogans they came up with:
Do what you want when you graduate or wait 20 years for your midlife crisis.
Insurance for when the robots take over all the boring jobs.
Okay then. Follow your dreams in your next life.
Yeah, like your parents are so happy.
Careerism may explain the disenchantment some students feel with liberal arts, but not all of it. The economy is in better shape today than it was in periods in which the humanities were more popular, and many young people still do not shoot themselves from cannons into their careers but use their college years to enrich themselves in various ways. There is no good reason that the arts and humanities should not be able to compete for students’ attention during this interlude. A knowledge of culture, history, and ideas is still an asset in most professions, as it is in everyday life. But students stay away from the humanities anyway.
In this chapter I will diagnose the malaise of the arts and humanities and offer some suggestions for revitalizing them. They didn’t ask me, but by their own accounts they need all the help they can get, and I believe that part of the answer lies within the theme of this book. I will begin by circumscribing the problem.
AS A MATTER of fact, the arts and humanities are not in trouble. According to recent assessments based on data from the National Endowment for the Arts and the Statistical Abstract of the United States, they have never been in better shape.18 In the past two decades, symphony orchestras, booksellers, libraries, and new independent films have all increased in number. Attendance is up, in some cases at record levels, at classical music concerts, live theater, opera performances, and art museums, as we see in blockbuster shows with long lines and scarce tickets. The number of books in print (including books of art, poetry, and drama) has exploded, as have book sales. Nor have people become passive consumers of art. The year 1997 broke records for the proportion of adults drawing, taking art photographs, buying art, and doing creative writing.
Advances in technology have made art more accessible than ever before. A couple of hours of minimum-wage income can buy any of tens of thousands of audiophile-quality musical recordings, including many versions of any classical work performed by the world’s great orchestras. Video stores allow people in the boondocks to arrange cheap private screenings of the great classics of cinema. Instead of the three television networks with their sitcoms, variety shows, and soaps, most Americans can now choose from a menu of fifty to a hundred stations, including ones that specialize in history, science, politics, and the arts. Inexpensive video equipment and streaming video on the World Wide Web are allowing independent filmmaking to flourish. Virtually any book in print is available within days to anyone with a credit card and a modem. On the Web one can find the text of all the major novels, poems, plays, and works of philosophy and scholarship that have fallen out of copyright, as well as virtual tours of the world’s great art museums. New intellectual e-zines and web sites have proliferated, and back issues are instantly available.
We are swimming in culture, drowning in it. So why all the lamentations about its plight, decline, fall, collapse, twilight, and death?
One response from the doomsayers is that the current frenzy of consumption involves past classics and current mediocrities but that few new works of quality are coming into the world. That is doubtful.19 As historians of the arts repeatedly tell us, all the supposed sins of contemporary culture—mass appeal, the profit motive, themes of sex and violence, and adaptations to popular formats (such as serialization in newspapers)—may be found in the great artists of past centuries. Even in recent decades, many artists were seen in their time as commercial hacks and only later attained artistic respectability. Examples include the Marx Brothers, Alfred Hitchcock, the Beatles, and, if we are to judge by recent museum shows and critical appreciations, even Norman Rockwell. There are dozens of excellent novelists from countries all over the world, and though most television and cinema is dreadful, the best can be very good indeed: Carla on Cheers was wittier than Dorothy Parker, and the plot of Tootsie is cleverer than the plots of any of Shakespeare’s cross-dressing comedies.
As for music, though it may be hard for anyone to compete against the best composers from the eighteenth and nineteenth centuries, the past century has been anything but barren. Jazz, Broadway, country, blues, folk, rock, soul, samba, reggae, world music, and contemporary composition have blossomed. Each has produced gifted artists and has introduced new complexities of rhythm, instrumentation, vocal style, and studio production into our total musical experience. Then there are genres that are flourishing as never before, such as animation and industrial design, and still others that have only recently come into existence but have already achieved moments of high accomplishment, such as computer graphics and rock videos (for instance, Peter Gabriel’s Sledgehammer).
In every era for thousands of years critics have bemoaned the decline of culture, and the economist Tyler Cowen suggests they are the victims of a cognitive illusion. The best works of art are more likely to appear in a past decade than in the present decade for the same reason that another line in the supermarket always moves faster than the one you are in: there are more of them. We get to enjoy the greatest hits winnowed from all those decades, listening to the Mozarts and forgetting the Salieris. Also, genres of art (opera, Impressionist painting, Broadway musicals, film noir) usually blossom and fade in a finite span of time. It’s hard to recognize nascent art forms when they are on the rise, and by the time they are widely appreciated their best days are behind them. Cowen also notes, citing Hobbes, that putting down the present is a backhanded way of putting down one’s rivals: “Competition of praise inclineth to a reverence of antiquity. For men contend with the living, not with the dead.”20
But in three circumscribed areas the arts really do have something to be depressed about. One is the traditions of elite art that descended from prestigious European genres, such as the music performed by symphony orchestras, the art shown in major galleries and museums, and the ballet performed by major companies. Here there really may be a drought of compelling new material. For example, 90 percent of “classical music” was composed before 1900, and the most influential composers in the twentieth century were active before 1940.21
The second is the guild of critics and cultural gatekeepers, who have seen their influence dwindle. The 1939 comedy The Man Who Came to Dinner is about a literary critic who achieved such celebrity that we can believe that the burghers of a small Ohio town would coo and fawn over him. It is hard to think of a contemporary critic who could plausibly inspire such a character.
And the third, of course, is the groves of academe, where the foibles of the humanities departments have been fodder for satirical novels and the subject of endless fretting and analyzing.
After nineteen chapters, you can probably guess where I will seek a diagnosis for these three ailing endeavors. The giveaway lies in a statement (attributed to Virginia Woolf) that can be found in countless English course outlines: “In or about December 1910, human nature changed.”22 Woolf was referring to the new philosophy of modernism that would dominate the elite arts and criticism for much of the twentieth century, and whose denial of human nature was carried over with a vengeance to postmodernism, which seized control in its later decades. The point of this chapter is that the elite arts, criticism, and scholarship are in trouble because the statement was wrong. Human nature did not change in 1910, or in any year thereafter.
ART IS IN our nature—in the blood and in the bone, as people used to say; in the brain and in the genes, as we might say today. In all societies people dance, sing, decorate surfaces, and tell and act out stories. Children begin to take part in these activities in their twos and threes, and the arts may even be reflected in the organization of the adult brain: neurological damage may leave a person able to hear and see but unable to appreciate music or visual beauty.23 Paintings, jewelry, sculpture, and musical instruments go back at least 35,000 years in Europe, and probably far longer in other parts of the world where the archaeological record is scanty. The Australian aborigines have been painting on rocks for 50,000 years, and red ochre has been used as body makeup for at least twice that long.24