The social and historical analysis of science poses no threat to the institution’s core assumption about the existence of an accessible “real world” that we have actually managed to understand with increasing efficacy, thus validating the claim that science, in some meaningful sense, “progresses.” Rather, scientists should cherish good historical analysis for two compelling reasons. First, real, gutsy, flawed, socially embedded history of science is so immeasurably more interesting and accurate than the usual cardboard pap about marches to truth fueled by universal and disembodied weapons of reason and observation (“the scientific method”) against antiquated dogmas and social constraints. Second, this more sophisticated social and historical analysis can aid both the institution of science and the work of scientists—the institution, by revealing science as an accessible form of human creativity, not as an arcane enterprise hostile to ordinary thought and feeling, and open only to a trained priesthood; and the individual, by fracturing the objectivist myth that only generates indifference to self-examination, and by encouraging study and scrutiny of the social contexts that channel our thinking and frustrate our potential creativity. In fact, and speaking now to my colleagues in science, I can cite no better example for the benefit of foxy diversity, gained from reading humanistic analyses of the social role and personal psychology of scientists, in abetting our hedgehog’s goal of doing “straight” science even better.

  HOW NOW DICHOTOMY—AND HOW NOT

  If, in general, dichotomy represents such a false mode for parsing either the structure of nature or the forms of human discourse; and if, in particular, we have erred grievously every time in depicting the history of interaction between science and the humanities as a series of episodes in dichotomous struggle, then why does this fallacy of reasoning, like the proverbial bad penny, keep turning up to poison our understanding and sour our relationships? I would end this critical yet hopeful commentary (for the optimistic side of my being compels me to believe that the exposure of a fallacy can lead to its correction, whatever the odds or the entrenchments) by reiterating three major reasons for the hold of dichotomy upon our schemes and perceptions. The third and most important factor also grants me the literary license to end this meandering section in tight and recursive form by returning to the opening discussion of Francis Bacon, the much misunderstood and underappreciated avatar of the Scientific Revolution, but also a wise social and philosophical critic who, so long ago, presented the best refutation of dichotomy, both in the lesson of his life and the content of his argument.

  1. The turf wars of history. However tight the logic of respectful separation may be, and however salutary the benefits of such equal and mutually supportive regard might prove, a basic foible of human affairs prevents the achievement of such gracious sharing when the history of turf—whether the prize be actual land and resources or just intellectual space—begins with one side as steward of the totality. No one (or at least no institution in full unanimity) cedes turf voluntarily, however ultimately beneficial the move and strategy. Thus, if basic human inquisitiveness forces us to ask great questions about why the sky is blue and the grass green, and if, faute de mieux, this discourse fell under the rubric of theology before modern science arose to claim proper dominion over factual aspects of such inquiries about the natural world, then some theologians will resist (while others will see farther and strongly approve) the exit of religion from a domain that never properly fell under its competence.

  Similarly, if Renaissance humanists once assumed that their techniques of locating and explicating Ancient texts could best resolve all questions about factual nature, then some adherents to this orthodoxy will resist the legitimate claims of a new institution—modern science—for observation and experiment as a more effective pathway to the same goal. With goodwill and the passage of time, these inevitable roilings and suspicions should settle down into an honorable peace based on advantages for both sides (a “win-win” situation in the jargon of our times). But we should probably regard the initial (and heated) skirmishes as unavoidable—the basic theme of the first part of this book, on the “rite and rights of an initiating spring” for modern science. And we should confine our task to deploring and correcting the continuation of such a conflict well beyond this early period of legitimacy—as this inevitable opening move can only become destructive once a novel field has secured its birthright, for generosity and mutual support should then prevail.

  2. The hopes of psychology. Scientists must understand the limits of their calling for a second practical and powerful reason beyond the first argument above, about turf wars. We live in a vale of tears, and bad things often happen to good people. These unpleasant facts about life cannot be avoided. Therefore, and especially, we need to sustain a realm of human goodness, and a calm place of optimism based on value and meaning, amid realities that we yearn to avoid but cannot deny. Yet our hopes and needs run so high that, until the reality of reiterated experience forces us to bite the bullet and bow to the inevitable, we also try to invest factual nature with the sustaining myths of “all things bright and beautiful,” or the psalmist’s vain hope (37:25) and massive self-deception: “I have been young, and now am old; yet have I not seen the righteous forsaken, nor his seed begging bread.”

  Science can only document these realities that all of us would rather deny or mitigate. And because humans have long practiced a lamentable tendency to slay the innocent messenger of bad news, science does need to specify and defend its role as a messenger and not a moralizer, and then to insist that the message, properly read (admittedly against the hopes and traditions of ages), truly contains seeds of resolution and grounds for genuine optimism. That is, science must insist that, whatever the factual state of nature, our yearnings and quest for morality and meaning belong to the different domains of the humanities, the arts, philosophy, and theology—and cannot be adjudicated by the findings of science. Facts may enrich and enlighten our moral questions (about the definition of death, the beginning of life, or the validity of using embryonic stem cells in biological research). But facts cannot dictate the answers to questions about the “oughts” of conduct or the spiritual meanings of our lives. If we keep these distinctions clear, then nature’s unpleasant facts, as ascertained by science, pose no threat to humane studies, and may even foster our discourse in morality and art by posing new issues in different ways.

  Still, scientists must recognize and understand how legitimate fear often trumps solid logic to cast unfair suspicion upon a messenger, especially when such a long tradition fuels the false dichotomy and resulting enmity. Thus I do acknowledge how much Wordsworth loved nature, and I do not begrudge his fears, though I must criticize his argument, when he wrote so famously, in a beautiful, but tragically flawed, verse:

  Sweet is the lore which nature brings,
  Our meddling intellect
  Distorts the beauteous forms of things.
  We murder to dissect.

  I would only say to the poets that science must dissect as one path to understanding, but never to destroy the beauty and joy of wholeness. And I do regret that some of my colleagues have made rash claims for granting science a decisive role in aesthetic and moral judgment. To all our Wordsworths, I would only grant assurance and strongly affirm that my profession can never challenge, and should only admire, your identification and reverence for those “thoughts that do often lie too deep for tears,” to cite the final line of the Ode on Intimations of Immortality, judged by Emerson (and I agree) as the finest poem ever written in the English language. I would also remind Mr. Wordsworth that the “host of golden daffodils,” his embodiment of joy in nature, grew within my realm and under my rules—and that I experience nothing but pleasure and gratitude in learning about his appreciation and inspiration.

  3. The inborn habits of dichotomy. I have argued throughout this part that, however intensified by particular reasons of history and psychology, the affliction of dichotomy—the basis for our false, yet persistent, model of opposition between science and the humanities—probably lies deep within our neurological wiring as an evolved property of mental functioning, once adaptive in distant ancestors with far more limited brain power, but now inherited as cognitive baggage. This impediment from our evolutionary past engenders great harm in leading us to misunderstand the complexities that now define our lives and dangers—thus overwhelming whatever benefit dichotomy might still provide in simplifying the immediate cognitive decisions that defined the “do or die” of some ancient forebears, but that now rarely impact our current lives in the same way.

  In an admittedly ironic paradox of recursion (the requirement that mind must reflect upon mind in order to break the primary impediment), our best chance for exposing and expunging the fallacy of dichotomous opposition between science and the humanities lies in showing that a powerful myth about scientific procedure—the legend that spawned the impression of science as an objective activity, strictly divorced from all the mental quirks and subjectivities underlying creative work in the humanities—founders on a false assumption best exposed by scrutinizing such intrinsic mental biases as our propensity for dichotomy itself. These universal cognitive biases affect the work of scientists as strongly as they impact any other human activity—perhaps with even greater force because scientists have so firmly enclosed themselves within an ideology that denies the efficacy, or even the existence, of such biases. And what influence can be more pervasive or insidious than a strong effect that cannot be perceived because the rules of the game preclude a proper perception of the problem?

  This myth of objectivity—the belief that scientists achieve their special status by freeing their minds of constraining social bias and learning to see nature directly under established rules of “the scientific method”—drives a wedge between science and the humanities, because historians, sociologists, and philosophers of science know that such a mental state cannot be achieved (while they do not doubt the ability of science to gain reliable factual knowledge about the natural world, even if this knowledge must be obtained in curiously roundabout ways by flawed human reasoning); whereas scientists mistake these truthful and helpful analyses by colleagues in the humanities as attacks upon the purity of their enterprise, rather than an intended affirmation that all our mental activities, including science, can only be pursued by gutsy human beings, warts and all (and that we often learn more from the warts than from the idealizations).

  If scientists would admit the ineluctable human character of their enterprise, and if students of science within the humanities would then acknowledge the power of science to increase the storehouse of genuine knowledge by working with all the flaws of human foibles, then we could break the hold of dichotomy and break bread together. The first, and in many ways still the best, analysis of the inherent mental biases underlying all scientific work resides in the most important treatise written by Francis Bacon himself—a particularly ironic situation because Bacon’s name then became associated with the opposite position that has fueled the flames of dichotomy for centuries. For reasons described just below, the “objective” process of simply recording facts, and then drawing logical inferences from these lists of facts alone, became known, in anglophone jargon, as “the Baconian method,” thus tying the name of this avatar of the Scientific Revolution to the myth that then drove a wedge between science and other intellectual activities—not Bacon’s intention at all, as we shall see.

  For example, in a famous statement from his autobiography, Charles Darwin, with uncharacteristic misunderstanding (or misremembering) of his own life and work, described his initial inklings about evolution: “My first notebook was opened July 1837. I worked on true Baconian principles and without any theory collected facts on a wholesale scale.” Of course, Darwin did not, and could not, so proceed. From the very beginning he tested, retested, proposed, rejected, and refined a wide and ever-changing spate of theoretical assumptions, until he finally developed the theory of natural selection by a complex coordination of mental preferences and factual affirmation. To refute his own naive claim, I need only restate my favorite Darwinian line, cited several times before: “How odd it is that anyone should not see that all observation must be for or against some view if it is to be of any service.”

  Bacon’s dubious, and wholly undeserved, reputation as the apostle of a purely enumerative and accumulative view of factuality as the basis for theoretical understanding in science rests upon the tables for inductive inference that he included in the Novum Organum, the first substantive section following the introduction to his projected Great Instauration. Bacon, who has never been accused of modesty, had vowed as a young man “to take all knowledge for my province.” To break the primary impediment of unquestioned obeisance to ancient authority (the permanence and optimality of classical texts), Bacon vowed to write a Great Instauration (or New Beginning) based on principles of reasoning that could increase human knowledge by using the empirical procedures then under development and now called “science.”

  Aristotle’s treatises on reasoning had been gathered together by his followers and named the Organon (tool, or instrument). Bacon therefore named his treatise on methods of empirical reasoning the Novum Organum, or “new instrument” for the Scientific Revolution. The “Baconian method,” as Darwin used and understood the term, followed the tabular procedures of the Novum Organum for stating and classifying observations, and for drawing inductive inferences therefrom, based on common properties of the tabulations.

  Perhaps Bacon’s tables do rely too much on listing and classifying by common properties, and too little on the explicit testing of hypotheses. Perhaps, therefore, this feature of his methodology does buttress the objectivist myth that has so falsely separated science from other forms of human creativity. But when we consider the context of Bacon’s own time, particularly his need to emphasize the power of factual novelty in refuting a widespread belief in textual authority as the only path to genuine knowledge, we may understand an emphasis that we would now label as exaggerated or undue (largely as a consequence of science’s preeminent success).

  Nonetheless, a grand irony haunts the Novum Organum, for this work, through its tabular devices, established Bacon’s reputation as godfather to the primary myth of science as an “automatic” method of pure observation and reason, divorced from all sloppy and gutsy forms of human mentality, and therefore prey to the dichotomous separations that have so falsely represented the relations of science and the humanities for more than three hundred years of Western history. In fact, the most brilliant sections of the Novum Organum—scarcely hidden under a bushel by Bacon, and well known to subsequent historians, philosophers, and sociologists—refute the Baconian myth by defining and analyzing the mental and social impediments that lie too deeply and ineradicably within us to warrant any ideal of pure objectivism in human psychology or scholarship. Bacon referred to these impediments as “idols,” and I would argue that their intrusive inevitability fractures all dichotomous models invoked to separate science from other creative human activities. Bacon should therefore be honored as the primary spokesman for a nondichotomized concept of science as a quintessential human activity, inevitably emerging from the guts of our mental habits and social practices, and inexorably intertwined with foibles of human nature and contingencies of human history—not apart but embedded, yet still operating to advance our general understanding of an external world and therefore to foster our access to “factual truth” under any meaningful definition of such a concept.

  The old methods of syllogistic logic, Bacon argues, can only manipulate words and cannot access “things” (that is, objects of the external world) directly: “Syllogism consists of propositions, propositions of words, and words are the tokens and marks of things.” Such indirect access to things might suffice if the mind (and its verbal tools) could express external nature without bias; but we cannot operate with such mechanistic objectivity: “If these same notions of the mind (which are, as it were, the soul of words) . . . be rudely and rashly divorced from things, and roving; not perfectly defined and limited, and also many other ways vicious; all falls to ruin.” Thus, Bacon concludes, “we reject demonstration or syllogism, for that it proceeds confusedly; and lets Nature escape our hands.”

  Rather, Bacon continues, we must find a path to natural knowledge—as we develop the procedure now known as modern science—by joining observation of externalities with scrutiny of internal biases, both mental and social. For this new form of understanding “is extracted . . . not only out of the secret closets of the mind, but out of the very entrails of Nature.” As for the penchants and limitations of mind, two major deficiencies of sensory experience impede our understanding of nature: “the guilt of Senses is of two sorts, either it destitutes us, or else deceives us.”

  The first guilt, “destitution,” identifies objective limits upon physical ranges of human perception. Many natural objects cannot be observed “either by reason of the subtlety of the entire body, or the minuteness of the parts thereof, or the distance of place, or the slowness, and likewise swiftness of motion.”

  But the second guilt, “deception,” denotes a more active genre of mental limitation defined by internal biases that we impose upon external nature. “The testimony and information of sense,” Bacon states, “is ever from the Analogy of Man, and not from the Analogy of the World; and it is an error of dangerous consequence to assert that sense is the measure of things.” Bacon, in a striking metaphor once learned by all English schoolchildren but now largely forgotten, called these active biases “idols”—or “the Idolae, wherewith the mind is preoccupate.”

  Bacon identified four idols and divided them into two major categories, “attracted” and “innate.” The attracted idols specify social and ideological biases imposed from without, for they “have slid into men’s minds whether by the placits and sects of philosophers, or by depraved laws of demonstrations.” Bacon designated these two attracted biases as “idols of the theater” for limitations imposed by old and unfruitful theories that persist as constraining myths (“placits of philosophers”); and, in his most strikingly original conception, “idols of the marketplace,” for limitations arising from false modes of reasoning (“depraved laws of demonstrations”), and especially from failures of language to provide words for important ideas and phenomena, for we cannot properly conceptualize what we cannot express. (In a brilliant story titled “Averroes’ Search,” the celebrated Argentinean writer Jorge Luis Borges, who strongly admired Bacon, described the frustration of this greatest medieval Islamic commentator on Aristotle, as he struggled without success to understand two words central to Aristotle’s Poetics, but having no conceivable expression in Averroes’s own language and culture: comedy and tragedy.)