Descartes lost his mother at an early age and became a thinker who gave birth to the truth out of the lonely chamber of his own head. Surely this is not without significance. Margaret Cavendish was a female philosopher in a century when women simply did not publish books on natural philosophy, although a number of women wrote and engaged in philosophical debates, especially in the form of correspondence. Is Cavendish’s position as a woman in that culture at that time unrelated to her idea that “man” is not the only creature in the universe possessed of reason? Isn’t it reasonable to acknowledge that her marginalized position gave her not only a perspective most of her philosophical peers could not share but also insights to which they were blind? Like many children of his class in England, John Bowlby saw little of his parents and was raised by a nanny whom he loved. According to most accounts, she left the household when he was four years old, and he suffered from the loss. At eleven, Bowlby was sent to boarding school, where he was not happy. In private, he admitted that his childhood experiences had influenced him. He understood himself to “have been sufficiently hurt but not sufficiently damaged.”378 Presumably, he meant by this that the pains of his childhood gave him a way into questions of attachment without ruining him as a person.

  Objective methodologies are important. Studies that are replicated over and over again may be said to have “proven” this or that fact, but as Bateman’s fruit fly study shows, sometimes results that conform to expectation are so welcome they become “true” for hundreds of scientists until proved wrong or at least problematic. Furthermore, ongoing research in science is forever coming up with conflicting results, as is easily seen in spatial rotation studies, but that is just one example among hundreds. And research that is inconclusive or comes up with no result usually remains obscure. For all the publicity her twin study received, Myrtle McGraw’s originality was ignored for years. Then again, her tone was careful and thoughtful, and she didn’t loudly advertise a conclusion that supported either heredity or experience. On top of that, she was a woman. The genius of women has always been easy to discount, suppress, or attribute to the nearest man.

  When a person cares about her work, whether she is a poet or a physicist, doesn’t her labor involve identification and attachment? In every discipline, without exception, human beings attach themselves to ideas with a passion that is not dissimilar to our love for other people. If you have spent most of your life on Virginia Woolf or have been immersed for thirty years in the writings of John Dewey, you may well inflate their importance in the history of literature and ideas, at least in the eyes of people who have spent their lives on other texts or have no interest in books at all. If you have made a career in evolutionary psychology, arguing that human beings have evolved hundreds of discrete problem-solving mental modules, is it surprising if you discount evidence that suggests those modules may not exist? If your entire career rests on the assumption that the mind is a computational device, are you likely to abandon this idea just because there are people hard at work writing papers in other fields who insist you are wrong? A neuroscientist who has devoted herself to a particular part of the brain, say the insula or the hippocampus or the thalamus, is bound to have a somewhat aggrandized feeling about that particular brain area, just as someone who has been working on the “connectome” will be devoted to the idea that the complete mapping of a creature’s neural connections will be invaluable to the future of science. Could it be any other way?

  I am deeply attached to the novel as a form of almost enchanted flexibility. I believe in it, and, unlike many people, I think reading novels expands human knowledge. I also believe it is an extraordinary vehicle for ideas. I make my living writing novels. My attraction to phenomenology and psychoanalysis, both of which explore lived experience, fits nicely with my interest in literature. The novel is a form that addresses the particularity of human experience in one way or another. Phenomenology investigates consciousness from a first-person point of view, and psychoanalysis is a theory that includes the minutiae of a patient’s day-to-day life. If you couple the inevitable attachment many of us feel for our work, if we are lucky enough to feel attached to it, with a need to “advance” in a field of choice, then it is hardly odd that people who care about their work form attachments to its content that cannot be described as objective. These passions are, in fact, subjective, but they are also intersubjective because no one, not even the novelist, works entirely alone. She sits in a room of her own and writes, but she is in that room with others, not only the real people who have shaped her unconscious and conscious imagination, but also fictive people and the voices of hundreds of people now dead who left their words in the books she has read.

  Human beings are animals with hearts and livers and bones and brains and genitals. We yearn and lust, feel hunger and cold, are still all born from a woman’s body, and we all die. These natural realities are inevitably framed and understood through the culture we live in. If each of us has a narrative, both conscious and unconscious, a narrative that begins in the preverbal rhythms and patterns of our early lives and cannot be extricated from other people, those to whom we were attached, people who were part of shaping the sensual, muscular, emotional rhythms that lie beneath what become fully articulated narratives intended to describe symbolically the arc of a singular existence, then each of us has been and is always already bound up in a world of others. Every story implies a listener, and we learn how to tell stories to make sense of a life with those others. Every story requires both memory and imagination. When I recall myself at six walking to school, I am not really back in my six-year-old body. I must travel back in time and try to imagine what I felt like then. When I imagine the future, I rely on the patterns of the past to frame what may happen next Thursday. When I invent a character, I use the same faculty. I draw on that continuum of memory and imagination. Human beings are predictive, imaginative creatures who navigate the world accordingly.

  Could it be that the language we have for speaking about what we are has itself become intractable? How far have we come from Descartes and Hobbes and Cavendish and Vico? How are we to think of minds and bodies or embodied minds or bodies with brains and nervous systems that move about in the world? In what way are these biological bodies machines? Does what we call mental life emerge from a developing organism, or is matter shot through with mind, as some physicists and panpsychist philosophers have argued? Is the whole person more than her parts, or can she be broken down like a car engine? Exactly how do we understand an individual’s borders in relation to what is outside him? Is it possible to have a theory about the mind or the world or the universe that doesn’t leave something out? Should we turn away from things we can’t explain?

  When I think of these questions, I travel back to childish thoughts, to when I lay in the grass and watched the clouds and thought how strange it was to be alive, and I placed my hand on my chest to feel my heart beat and counted until I got bored. Sometimes I would say a word to hear it move from inside my head to my mouth and then outside me as sound. Sometimes I would feel I was floating and leaving my body behind. I liked to go to a place behind my family house where the fat roots of a tree protruded from the steep banks above a creek and curled to form a chair where I could sit and meditate on the same questions I have meditated on in this essay, albeit from inside a much younger, naïve self, who lived in another time and place. My recollection of those reveries stays alive in me only from my current perspective in an ever-moving present. Over and over, I thought how strange it was to be a person, to see through eyes, and smell through something that poked out of the middle of my face and had holes in it. I would wiggle my fingers and stare at them amazed. Aren’t tongues awfully odd? Why am I “I” and not “you”? Are these not philosophical thoughts? And don’t many children have them? Isn’t good thinking at least in part a return to early wonder? Every once in a while, I tried to imagine being nowhere—that is, never having been anywhere. For me, it was like trying to imagine being no one. I still wonder why people are so sure about things. What they seem to share is their certainty. Much else is up for grabs.

  Coda

  I do not know who put me into the world, nor what the world is, nor what I am myself. I am terribly ignorant about everything. I do not know what my body is, or my senses, or my soul, or even that part of me that thinks what I am saying, which reflects about everything and about itself, and does not know itself any better than it knows anything else.379

  Blaise Pascal—mathematician, physicist, religious thinker—wrote these words in his Pensées, a collection of notes for a work he did not live to write, but which were published in 1669, seven years after his death. Pascal knew a lot. He invented an early calculating machine, the syringe, the hydraulic press, and a roulette machine, and he pioneered an early version of the bus system. His work on barometric pressure resulted in Pascal’s law. He contributed theorems to geometry and binomial mathematics. Nevertheless, his claim to ignorance must be taken seriously. The domains of ignorance he mentions—about the soul or psyche, about the sensual body, as well as about the nature of reflective self-consciousness, that part of a person that can think about the world around him and about himself and his own thoughts—remain mysterious in ways general relativity does not.

  This can be hard to accept because if anything seems to exist on a high and rarefied plane it is physics. After all, what could be more important than puzzling out the secret laws of the universe? And yet, the physicists who have entered into the consciousness debates do not have one answer; they have different answers. Many biologically oriented scientists point to their own hands-on, close-up research that appears to fly in the face of timeless mathematical reduction. To borrow an image from Cavendish’s world: The worm- and fish-men are in conflict with the spider-men.

  It is true that since the seventeenth century most people have lived in an age of science and have reaped the benefits and lived the nightmares of its discoveries. The “mind,” however, has been an especially bewildering concept, one fought over fiercely for centuries now. The computational theory of mind has deep roots in the history of mathematical abstraction in the seventeenth century and in what Whitehead called “misplaced concreteness,” the fallacy of mistaking an abstraction or model for the actuality it represents. With its mind-body split and its prejudice against the body and the senses, this tradition also harbors, to a greater or lesser degree, strains of misogyny that have infected philosophy since the Greeks. The brain science that adopted the computer as its model for the mind cannot explain how neural processes are affected by psychological ones, how thoughts affect bodies, because the Cartesian split between soul and body continues to thrive within it.

  These scientists have ended up in a peculiar place. Descartes related his rational immaterial soul to God. The immaterial soul of the present appears to be disembodied information. Some of the AI scientists who have embraced the latter kind of soul have been led step-by-step to its logical conclusion: an imminent supernatural age of immortal machines. Computation has become increasingly sophisticated and ingenious, but I believe computational theory of mind as it was originally understood in cognitive science will eventually breathe its last breath, and the science historians of the future will regard it as a wrong turn that took on the qualities of dogma. I could be wrong, but I have read nothing that leads me to believe otherwise. Personally, I think the corporeal turn is a move in the right direction.

  But then I, too, am a situation, the product of years of particular experiences that include reading and writing and thinking and loving and hating, of seeking and finding and losing and seeking again. I did not make myself but was made in and through other people. I cannot begin to understand myself outside my own history, which includes my whiteness and femaleness and class and privileged education, as well as my tallness and the fact that I like oatmeal, but also myriad elements that I will never be able to name, bits and pieces of a life lived but long forgotten or sometimes half remembered in the way dreams are, with no guarantee that it was actually like that at all.

  I am still a stranger to myself. I know I am a creature of unconscious biases and murky, indefinable feelings. Sometimes I act in ways I can’t comprehend at all. I also know that my perception of the world is not necessarily another person’s perception of it, and I often have to work to discover that alien perspective. Other times I seem to feel what another person feels so well, it is almost as if I have become him. Some fictional characters are much more important to me than real men and women. Every discipline has its own myths and fictions, for better and for worse. Many words slide in meaning depending on their use. The words “genes,” “biology,” “information,” “psychological,” “physiological” change so often, depending on their contexts, that confusion is bound to result.

  My own views have been and are subject to continual revision as I read and think more about the questions that interest me. Openness to revision does not mean a lack of discrimination. It does not mean infinite tolerance for rank stupidity, for crude thinking, or for ideology and prejudice masquerading as science or scholarship. It does not mean smiling sweetly through inane social chatter about genes, hardwiring, testosterone, or whatever the latest media buzz has on offer. It means reading serious texts in many fields, including the arts, that make unfamiliar arguments or inspire foreign thoughts you resist by temperament, whether you are a tough-minded thinker or a tender-minded one, and allowing yourself to be changed by that reading. It means adopting multiple perspectives because each one has something to tell you and no single one can hold the truth of things. It means maintaining a lively skepticism accompanied by avid curiosity. It means asking questions that are disturbing. It means looking closely at evidence that undermines what you thought had been long settled. It means getting all mixed up.

  Simone Weil wrote, “Doubt is a virtue of the intelligence.”380 As with any other principle, enshrining doubt as the highest value in thought may become merely another excuse for intolerance, but I believe there are forms of doubt that are virtuous. Doubt is less attractive than certainty to most people. The kind of doubt I am thinking of doesn’t swagger. It doesn’t shake its finger in your face, and it doesn’t go viral on the Internet. Newspapers do not write about it. Military parades do not march to tunes of doubt. Politicians risk mockery if they admit to it. In totalitarian regimes people have been murdered for expressing doubt. Although theologians have understood its profound value, religious fanatics want nothing to do with it. The kind of doubt I am thinking of begins before it can be properly articulated as a thought. It begins as a vague sense of dissatisfaction, a feeling that something is wrong, an as-yet-unformed hunch, at once suspended and suspenseful, which stretches toward the words that will turn it into a proper question framed in a language that can accommodate it. Doubt is not only a virtue of the intelligence; it is a necessity. Not a single idea or work of art could be generated without it, and although it is often uncomfortable, it is also exciting. And it is well-articulated doubt, after all, that is forever coming along to topple the delusions of certainty.

  NOTES

  1. Anca M. Pasca and Anna A. Penn, “The Placenta: The Lost Neuroendocrine Organ,” NeoReviews 11, no. 2 (2010): e64–e77, doi: 10.1542/neo.11-2-e64.

  2. Samuel Yen, “The Placenta as the Third Brain,” Journal of Reproductive Medicine 39, no. 4 (1994): 277–80.

  3. Neil K. Kochenour, “Physiology of Normal Labor and Delivery,” lecture, Library.med.utah.edu/kw/human_reprod/lectures/physiology/labor.

  4. René Descartes, Meditations on First Philosophy, trans. and ed. John Cottingham, in Cambridge Texts in the History of Philosophy (Cambridge: Cambridge University Press, 1996), 12.

  5. Ibid., 44.

  6. René Descartes, quoted in Daniel Garber, Descartes’ Metaphysical Physics (Chicago: University of Chicago Press, 1992), 122.

  7. Thomas Hobbes, Leviathan, ed. C. B. Macpherson (London: Penguin, 1981), 111.

  8. Ibid., 115.

  9. Ibid.

  10. Margaret Cavendish, Observations upon Experimental Philosophy, ed. Eileen O’Neill (Cambridge: Cambridge University Press, 2001), 158.

  11. Panpsychists include the seventeenth-century philosophers Baruch Spinoza (1632–77) and Gottfried Leibniz (1646–1716), the eighteenth-century Irish philosopher George Berkeley (1685–1753), the German philosopher Arthur Schopenhauer (1788–1860), the nineteenth-century physicist and philosopher Gustav Fechner (1801–87), the physician, philosopher, and physiologist Wilhelm Wundt (1832–1920), the American Pragmatist philosophers Charles Sanders Peirce (1839–1914) and William James (1842–1910), Alfred North Whitehead (1861–1947), the physicist David Bohm (1917–92), the French philosopher Gilles Deleuze (1925–95), and the contemporary analytical philosopher Galen Strawson (1952–). For an overview of the question, see David Skrbina, Panpsychism in the West (Cambridge, MA: MIT Press, 2007).

  12. Cavendish, Observations upon Experimental Philosophy, 135.

  13. Margaret Cavendish, quoted in Anna Battigelli, Margaret Cavendish and the Exiles of the Mind (Lexington: University Press of Kentucky, 1998), 101.

  14. Denis Diderot, Rameau’s Nephew/D’Alembert’s Dream, trans. Leonard Tancock (London: Penguin, 1976), 181.

  15. Denis Diderot, quoted in Michael Moriarty, “Figures of the Unthinkable: Diderot’s Materialist Metaphors,” in The Figural and the Literal: Problems of Language in the History of Science and Philosophy, 1630–1800, ed. Andrew E. Benjamin, Geoffrey N. Cantor, and John R. R. Christie (Manchester: Manchester University Press, 1987), 167.