The Delusions of Certainty
The truth is that people do not agree on the mind. There is no single theory about what it is. Confusion reigns, and not only among those who rarely think about the mind-body problem. Scientists, philosophers, and scholars of all kinds frequently clash over this question. The battles go under different names, but there are many struggles over consciousness—what it is and why people have it at all. This is interesting because hardly anyone today would argue against Copernicus. We agree that the earth revolves around the sun. No one is saying that William Harvey was wrong about heart function. Einstein’s theory of relativity is generally accepted, as is quantum mechanics, even though the two cannot be unified in a single overarching theory in physics. And yet, the battles being fought today about “the mind” under various banners have not changed all that much since the seventeenth century. There are wrinkles in the various forms of dualisms and monisms, but they are still with us.
In his book Science and the Modern World, published in 1925, Alfred North Whitehead summarized the mind-body, mind-matter quarrels:
The seventeenth century had finally produced a scheme of scientific thought framed by mathematicians, for the use of mathematicians. The great characteristic of the mathematical mind is its capacity for dealing with abstractions; and for eliciting from them clear-cut demonstrative trains of reasoning, entirely satisfactory so long as it is those abstractions which you want to think about. The enormous success of the scientific abstractions, yielding on the one hand matter with its simple location in space and time, on the other hand mind, perceiving, suffering, reasoning, but not interfering, has foisted onto philosophy the task of accepting them as the most concrete rendering of fact.
Thereby, modern philosophy has been ruined. It has oscillated in a complex manner between three extremes. There are the dualists, who accept matter and mind on an equal basis, and the two varieties of monists, those who put mind inside matter, and those who put matter inside mind. But this juggling with abstractions can never overcome the inherent confusion introduced by the ascription of misplaced concreteness to the scientific scheme of the seventeenth century.25
Whitehead was a mathematician, logician, physicist, and philosopher. Principia Mathematica, which he wrote with Bertrand Russell, remains a landmark work in logic and mathematics, although Kurt Gödel’s incompleteness theorem demonstrated that the Principia could not be both consistent and complete. Influenced by the radical upheaval of quantum mechanics, Whitehead came to reject materialism and the very idea that things are locked into a specific space and time. He offered instead a metaphysics of process, movement, and becoming. His thought is often described as a form of panpsychism. While Whitehead’s metaphysics are notoriously difficult to penetrate, his analysis of the history of science in Science and the Modern World is acute and far more accessible, whether one accepts his critique or not. He was intensely aware of what was at stake: “If science is not to degenerate into a medley of ad hoc hypotheses, it must become philosophical and must enter upon a thorough criticism of its own foundations.”26 As Whitehead argues, those foundations were laid in the seventeenth century.
Why did I choose Descartes, Hobbes, Cavendish, and Vico when there are many other interesting philosophers who have asked the same questions and come up with many interesting solutions? I am using the four merely as philosophical touchstones. Each of them offered a distinct way of understanding what it means to be a thinking person in the world. Each had a dualist or monist theory of his or her own. Each philosopher composed his or her melody of thought, melodies that continue to be heard and played, even by those who have no idea who invented the tune. Two of them, Descartes and Hobbes, have had a profound and lasting influence on philosophy, science, and many other disciplines. The other two, Cavendish and Vico, remain marginal to the dominant tradition, but they, too, have had and continue to exercise what might be called a subversive influence.
The mathematical model of abstractions Whitehead mentions is important because in the number-as-truth or triangle-as-truth model, imagination is either banished from the kingdom or plays the role of handmaiden to Reason. Whitehead saw, I think accurately, that there is an imaginative aspect to all thought: “Every philosophy is tinged with the colouring of some secretive imaginative background, which never emerges explicitly into its trains of reasoning.”27 The imagination, which now is often understood as a synonym for “creativity,” was traditionally used in philosophy to describe mental imagery as opposed to sensory perception. As I sit here writing, I see the coffee cup beside me, the papers spread on my desk along with open books and a small red-and-black clock. When I leave the room, I can call up an image of my messy desk, but imperfectly. The imagination consists of all the sights, sounds, smells, and feelings retained when a person remembers an event or place and when she fantasizes about an event that never happened or imagines a place where she has never been. If I had never perceived anything, I could not remember or imagine. For Hobbes, step-by-step reasoning was superior to the imagination, which he understood as a form of memory or “decayed sense,” a duller version of actual sensory perception. Descartes used the imagination, fantaisie, as a convenient intermediate area between direct sensation (pure bodily experience) and reasoning (pure mental experience). Through the imagination he found a way in which the body and mind could interact. For Cavendish, imagination (fancy) and emotion, along with reason, are all crucial to knowledge. Vico believed that memory, imagination, and metaphor originate in the body and its senses and are necessary to the story of thought itself.
(This is a personal essay, a work in which I try to understand what has been hard for me to understand. It is not a survey of Western philosophy or even an examination of my four touchstone philosophers. Nevertheless, I am driven by a sense of urgency, in part because the unsolved problems of the mind and body are often treated as if they were behind us, not only in the media, always prone to sensationalism and easy answers, but in philosophy and science. Over and over again, I have been confronted with books, papers, blogs, and articles that make blithe assumptions about how the mind or the brain-mind works and therefore about the nature of human beings. Often the underlying assumptions are hidden, even from the people who are building the arguments.
I hope to pry open some of these foundational beliefs or confused premises by asking questions that do not have ready answers. I want to take up a number of subjects, some of them popular, others more obscure, which, at the very least, will present the reader with the fact that much remains unknown about the mind and its relation to the body and the world. I confess I am also on a mission to dismantle certain truisms that have been flying at me from left and right for years, truisms about nature, nurture, genes, twin studies, and the “hardwired” brain. I have grown weary of smug assertions about hormones and psychological sex differences, the frankly crude declarations of evolutionary psychology, and some of the fantasies that are writ large in artificial intelligence. I address other subjects, however, such as placebo effects, false pregnancy, hysteria, and dissociative identity disorder because these illnesses and bodily states illustrate the gaps in current knowledge about the mind-brain. I also discuss phenomenology, the study of consciousness from a first-person point of view, to see if insights from that discipline might help frame the problem of mind. I could have chosen any number of other topics and subjected them to the same interrogations. But I am less interested in the specific targets of my research than I am in showing how often the old problems of monism and dualism, mind and body, inside and outside, haunt scientific and scholarly research.
There is something else that fascinates me as well: the imaginative background Whitehead mentions. It colors philosophy and science and scholarship of all kinds, even when it is not acknowledged. Dreams of purity and power and control and better worlds as well as fears of pollution, chaos, dependency, and helplessness tinge even the most rigorous modes of thought. Sometimes that imaginative background is a riot of bold color and sometimes it is a faint pastel wash, but it inevitably shows itself even when it isn’t present in “trains of reasoning.” I am not of the opinion that these imaginative backgrounds are bad. I am rather of the opinion that the attempt to purge thought of its imaginative background, whether brilliant or pale, is a mistake. My embellishment of Whitehead’s metaphor, by which I turn his background into a canvas, is conscious, not unconscious. Like Vico, I think metaphor is not only inescapable but essential to thinking itself.)
Let me return to my informal survey. Were there any answers to my questions about the mind that struck me as particularly astute? There were two. One clever man told me that the mind is the thoughts the brain produces. A clever woman told me the mind is consciousness, and the brain is the organ of consciousness. Neither of them is a philosopher. The man is a writer, and the woman an actress, but both of them had wondered and worried about the mind question. I took their answers to mean that a person’s internal experience of thinking and more broadly the experience of consciousness itself are different from simply understanding the brain’s operations, even if the brain is responsible for thoughts and consciousness.
Received Ideas and M
About a year ago, I was at a small dinner. One of the guests was a white, male, educated, left-leaning, well-regarded novelist, who declared as fact that some people, usually men, are born with a feeling of entitlement. I will call this person M. When I asked M what he meant by this, it became clear that he was not talking about being born white or into class privilege or the superiority traditionally assigned to men over women. This inborn, entitled masculinity he referred to was secreted in the genes as an innate quality or gift. I suspected he was referring to his own genetic makeup, but that may be unfair. He seemed to have picked up this remarkable idea from an article he had read. He had forgotten the source, but I don’t doubt something of the kind has been written somewhere. Inborn psychological traits are very popular these days. Whatever he had read, he was insistent that there was a gene or several genes for a sense of entitlement, and he clung to it for dear life. What is it about some thoughts in the sciences that makes them popular? And why do other ideas remain buried in universities? Why do some highly controversial notions move into the wider culture as accepted facts, when fierce battles are being waged over them inside the academy?
One explanation is that in popular culture science is often perceived as monolithic. We are continually confronted with new discoveries and new facts. “Scientists Discover . . .” and “New Scientific Study Finds . . .” are familiar beginnings for headlines. And while everyone who reads newspapers is aware that “scientists” come from many disciplines and frequently change their minds about this and that and do not always agree, there is a powerful sense that they, the scientists, are on an inexorable march forward, that knowledge accumulates bit by bit as it methodically uncovers the secrets of “nature.” The reason for this widespread belief is not obscure. We who live in developed countries, as well as many people who don’t, are the awestruck witnesses to technological changes that are nothing if not a testament to scientific research and its practical application. If you are as old as I am, that means everything from color TV to faxes to computers and cell phones and the Internet, as well as increasingly sophisticated missiles and drones. Quantum theory—that alarming and paradoxical account of nature that turned physics upside down—has been used to make objects such as lasers, which among many other things can improve one’s eyesight. “Cloning” is now a term in general use, as are “stem cell research,” “artificial insemination,” and “iris recognition scanner.”
The models that aid scientists in their manipulation of the natural world have been wonderfully effective, and the changes in our lives serve as proof that there is a relation between theories in science and the natural world. This undeniable reality, however, has had a blinding effect on many. The histories of philosophy and science are often forgotten in a rush to believe what we want to believe. Good ideas regularly go missing, and bad ideas often win the day. Good ideas are sometimes found again, but not always. Why some thoughts live and others die depends on multiple factors, most often, the context for their understanding. A man or a woman can sit in a room and think critically and well and write books and publish them, but that is no guarantee that others will be able to comprehend those ideas or go on to embrace them. The ideas must resonate in the culture. Margaret Cavendish’s ideas were either ridiculed or hidden for three hundred years. She remains far more obscure than Descartes and Hobbes and is still ignored by many scholars of the period. Her biggest problem was that she was a she.
Crucially, too, we must recognize that the language we use to think about “nature,” which many believe includes a natural, not supernatural, mind, plays an important role in what can be seen, discovered, and manipulated. Although Hobbes’s desire to purify language of all metaphor and reduce it to exact definitions remains alive in the sciences, metaphors abound and sometimes turn into literalisms that shut down rather than open up thinking. Also, there is no working science without a hypothesis. Before a person begins to do research, he has to have an idea of what he might find, and those ideas are inevitably shaped by earlier ideas that were not lost and by the metaphors used to understand them, as well as by a desire to prove the working hypothesis right or wrong.
Goethe was penetrating about how a hypothetical idea can take hold and infect one generation after another:
A false hypothesis is better than none; there is no danger in a false hypothesis itself. But if it affirms itself and becomes generally accepted and turns into a creed which is no longer doubted and which no one is allowed to investigate: this is the evil from which centuries will suffer.28
In his introduction to The Structure of Scientific Revolutions (1962), Thomas Kuhn asserted that no group of scientists could work without “some set of received beliefs” about what the world is like.29 He argued that before any research can be done, every scientific community must have agreed on the answers to a number of fundamental questions about the world and that these answers are embedded in the institutions that train scientists for their work. Kuhn, who began his career as a physicist, continues to distress his fellow scientists because the notion that the foundations of scientific work may be shaky remains a subversive position. In fact, the hostility toward Kuhn among scientists has often startled me. The mention of his name is enough to cause a bristling response. He is often viewed as someone who wanted to undermine science altogether, but I have never read him that way.
Like Whitehead, Kuhn understood that science rests on a foundation that is assumed and does not begin at the beginning. If every graduate student in biology were presented with Descartes’s first question and asked to confirm her own existence and the existence of the world beyond her, she would be stopped in her tracks. “Normal science,” for Kuhn, consisted of “a strenuous and devoted attempt to force nature into the conceptual boxes supplied by professional education.” He went on to wonder “whether research could proceed without such boxes, whatever the element of arbitrariness in their historic origins and, occasionally, in their subsequent development.”30 Whitehead, Goethe, and Kuhn agree that there are received beliefs in science. Whitehead challenges the received truths about material reality established in the seventeenth century and the tendency in science toward misplaced concreteness, mistaking the mathematical abstraction for the actuality it represents. The danger for Goethe is that an enduring hypothesis becomes truth and therefore goes unquestioned. For Kuhn, normal science floats along on the consensual, often unexamined beliefs he calls paradigms until some discovery, some intractable problem, explodes those same foundational convictions. He sees paradigm change as the upheaval that causes scientific revolutions.
What Are We? Nature/Nurture: Hard and Soft
The sense of entitlement M believed was inborn in male persons (which he might have confused with studies on dominance in male rats or other mammals lifted from evolutionary psychology, whose proponents argue that male dominance is a Darwinian sexually selected trait) is an old idea. Thoughts about male superiority have been framed in any number of ways since the Greeks, but in contemporary culture this still robust notion is usually understood in terms of a more recent history. Since the late nineteenth century in the West, human development has often been seen as a tug-of-war between nature and nurture. M was convinced that a feeling he called entitlement came by way of male nature, not nurture. What are the assumptions that make this nature/nurture division possible? Despite continual announcements from people in many disciplines that the divide is a false one or that it has been resolved, it continues to haunt scientists, scholars, and the popular imagination. In its starkest terms, the opposition looks like this: If people are destined at birth to turn out one way or another, then our social arrangements should reflect that truth. On the other hand, if people are largely creatures of their environments, then society should acknowledge this fact. The potential consequences of deciding that one is more important than the other are immense. Asking how much nature and how much nurture are involved in our development assumes that the two are distinct, that a neat line can be drawn between them and, with some luck, be measured.
A cartoon version of nature and nurture involves simple notions of inside/outside or person/environment. Two children are born and set out on the path of life. Along the way, they both learn to talk and sing and dance and read. They both encounter obstacles and trip over them. Both are nearly drowned in a flood, but one child grows into a strong, resilient adult and the other withers, becomes ill, and dies young. Why? A current and popular idea, articulated by M at the dinner and by many others, is that there is something inside the strong child, some hard or soft hereditary substance the weak child does not have or has less of, which helps him to survive external buffeting. That stuff is usually called our genes. In everyday conversation, people often refer to good genes and bad genes. The person who has lived on legumes and never touched a cigarette or a glass of wine and runs fervently every day but who drops dead at thirty-eight is often explained as a case of bad genes. Every day we read about scientists who have found a new “gene” for this or that illness. Not wholly unrelated to genes and equally popular in the media is the idea that our brains have been “hardwired” for this or that behavior. Choosing spouses, believing in God, what we find beautiful, even asking for directions have all been explained through brain wiring, hard or otherwise: “It Turns Out Your Brain Might Be Wired to Enjoy Art,” “Is the Human Brain Hardwired for God?” “Male and Female Brains Wired Differently, Scans Reveal.”31 Whether the reference is to genes or to brains, the implication is that an identifiable biological material is shaping our destiny.