Yet, in a social convention that may represent more than a cultural accident of Western life, and may well record some hard-wiring about hierarchies and dominations, we tend to equate big and up with power and righteousness, and down and small with lesser merit. In this context, Ray’s key (figure 19) for land birds advertises the potency of unconscious bias in a supposedly objective mapping of nature’s factuality. Note how, at every branching point, Ray places the socially favored of the two categories on top and the less worthy, in subjective human judgment, on the bottom—even though, as explained just above, the relative placement of the two branches does not affect the geometry of the system.
The first division places carnivorous birds like the noble eagle above, and frugivorous relatives like the blathering parrots below. The lower frugivore branch then trifurcates by body size into big, medium, and little, so arrayed from the preferred top to the subservient bottom. Meanwhile, the carnivores above undergo a basic division into worthy denizens of daylight above, and stealthy creatures of the night below. The nocturnal category then splits into haves and have-nots, with horned owls above and non-horned owls below (figure 20). The blessed day-fliers, meanwhile, divide by size into “greater” above and “lesser” below. The disenfranchised smaller birds then split again, but this time by explicit human judgment of worth into “more generous” above (“want to be reclaimed and manned for fowling”) and “more cowardly and sluggish, or else indocile” below, and “therefore by our falconers neglected and permitted to live at large.” (I can’t help remarking that current sensibilities would probably grant higher status to these indocile forms for their wit in avoiding human servitude.) These less honored indocile forms then split again, this time by size, into greater above and lesser below. Finally, and introducing the new twist of favored geography for a final division, the lesser indociles split into preferred denizens of Europe above and less well regarded “exotics” below.
Moving up to the “more generous” category of lesser diurnal forms, a final division, citing the venerable principle of “more is better” in purest form, prefers the “long wing’d” above to the “short wing’d” below. Meanwhile, the truly topmost category of large, diurnal, and rapacious birds undergoes its final split to produce an overall “winner”—as the “more generous” eagles above vanquish the “cowardly and sluggish” vultures below. God bless America, and look out for them buzzards.
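To make the mechanics of such a key concrete, here is a minimal sketch, in Python, of a dichotomous key represented as a data structure and walked by successive yes-or-no questions. The questions and group names are my own paraphrases of the divisions just described, not Ray’s wording.

```python
from dataclasses import dataclass
from typing import Union


@dataclass
class Split:
    question: str        # one two-way question per branching point
    if_yes: "Node"       # the branch Ray draws on top (the socially favored one)
    if_no: "Node"        # the branch Ray draws on the bottom

Node = Union[Split, str]  # a leaf is simply the name of a group

# A fragment of the land-bird key, paraphrasing the divisions described above.
land_birds = Split(
    "Carnivorous (rapacious)?",
    if_yes=Split(
        "Active by day?",
        if_yes=Split("Large-bodied?",
                     if_yes="eagles and vultures",
                     if_no="smaller diurnal raptors"),
        if_no=Split("Horned?", if_yes="horned owls", if_no="other owls"),
    ),
    if_no="frugivorous birds (divided further by size)",
)


def classify(node: Node, answer) -> str:
    """Walk the key, asking one yes/no question at each branching point."""
    while isinstance(node, Split):
        node = node.if_yes if answer(node.question) else node.if_no
    return node


# Example: a horned owl is carnivorous, nocturnal, and horned.
answers = {"Carnivorous (rapacious)?": True,
           "Active by day?": False,
           "Horned?": True}
print(classify(land_birds, answers.__getitem__))  # -> horned owls
```

Swapping the if_yes and if_no fields at any node changes nothing about where any bird ends up, which is exactly the geometric point made above: nothing in the logic of the key dictates which branch sits on top.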
Moving from the specific social prejudices in Ray’s key (based on size, sex, location, and gaudiness) into deeper biases within any such supposedly objective account, we must also consider the very choice of dichotomy itself as a basis of division—one of the primary “tribal idols” in Bacon’s analysis of our cognitive preferences. Much of the world comes to us as continua, or as other complex, and far more than two-valued, series of reasonably discrete states. We do construct useful simplifications when we force this complexity into a simple system of successive dichotomous branchings—for such sequential ordering does resonate with our mind’s capacity to grasp a structure within multifarious and hierarchical systems. But what truer or more insightful ways of classification do we miss when we invoke this almost automatic mental scheme without pressing ourselves to consider less congenial, but perhaps more rewarding, alternatives?
Figure 19.
Figure 20.
Oddly enough, and to expose my own parochialism, I used to think that dichotomous keying had been invented by biologists for displaying Linnaean systems in the clearest possible way. After all, I learned the rules and techniques for constructing dichotomous keys in my basic undergraduate biology course, and, as mentioned above, botanists and zoologists have been using such keys for centuries. But, in fact, dichotomous keying represents one of our oldest and most general cognitive inventions, used for ages, and across all disciplines, for organizing complex systems of information. Indeed, centuries before anyone ever thought about an empirically based classification of organisms, medieval schoolmen, following Saint Thomas and Aristotelian logic, used dichotomous keys as their primary device for displaying the conceptual structure of any classification. (And since classification is the primary Aristotelian technique for understanding causes, this scheme of analysis attains a maximal generality of potential use.)
For example, in 1586, a century before Ray constructed his dichotomous key for the classification of birds, the French jurist Nicholas Abraham published a schoolboy’s textbook on logic and ethics. He presented the guts of his classification as a strictly dichotomous key, exactly in the same form that Ray would use one hundred years later.10 He also follows Ray’s (presumably unconscious) practice of placing the favored category above the less distinguished choice at each division (figure 21)—even though the basic geometry of such branches cannot specify any “above” or “below.” For his primary separation, Abraham divides the basis of ethical judgments into Mentis (of the mind) above and Moris (by custom) below, for rational decisions trump social conventions. Figure 21 then shows subsequent divisions of the preferred mental category. Reasons of the mind divide into Sapientia (by wisdom) above and Prudentia (by discretion) below, as reason beats out emotion or convenience. Sapientia makes a final division into Intelligentia (by understanding) above and Scientia (by knowledge) below, as a well-wrought abstract argument triumphs over a factually forced decision. Prudentia then undergoes its final division into Bona consultatio (by good deliberation) above and Sagacitas (by the acuteness of personal decision) below, as the agreement of a collectivity trumps the uncertainty of a personal judgment, however wise the individual.
Figure 21.
But to show the depth and venerability of this primal idol of our tribe, consider the following example from the very heart of medieval scholastic traditions, as revealed in a marginal annotation in the oldest book of my personal collection. I own a lovely copy of Saint Thomas Aquinas’s commentaries on Aristotle—about as canonical as you can get for the genre, with the greatest of all medieval scholars explicating the greatest of all classical gurus—published in Cologne in 1487 or 1488, just a generation after Gutenberg’s invention of the printing press.
If I may venture an almost embarrassingly gushy personal comment, I can hardly begin to explain my pleasure in studying this text from the earliest days of the greatest invention in the history of intellectual life. (Book collectors refer to all volumes published before 1500 as “incunabula,” literally, from the cradle.) For my copy has been so extensively annotated, in an ancient style of handwriting that cannot differ much from the date of the publication itself, that the inked additions often double the total verbiage. I do not have a particularly active imagination, and therefore fail to enjoy the pleasures felt by some people who can observe the ruin of a single column from a classical Greek temple, and then conjure up, in their mind’s eye, not only the entire edifice, but also the practices and feelings of the original utilizers and inhabitants. But, somehow, I can so proceed with the annotations in this book because they are so extensive, written in schvartz (or sometimes in the red ink of rubrication), and therefore preserved in their totality. I can see a late-fifteenth-century consumer, a university student or an aspiring cleric perhaps, sitting by his nighttime candle (for this book preserves wax drippings on several pages), trying to puzzle out the logic of the great masters, and quickly writing down his epitomes, lest he forget.
The diligent annotator of Aristotle’s De Anima included several branching keys among his marginalia. They do not always follow a purely dichotomous pattern, as some divide their principal subjects into three or five subcategories. But strictly dichotomous keys, moving from left to right just as Abraham’s for ethics in 1586 and Ray’s for birds in 1678, abound. One example, written into book two of Saint Thomas’s commentary on Aristotle’s De Anima, intrigued me to the point of virtually proving a case for strong human inclinations toward sequential dichotomy, or successive divisions into pairs, as a preferred mental device for classifying complex systems. For, in this case, Aristotle’s text suggests either a single continuum with three ordered categories, or a dichotomous tree with a primary division of two, and a second division of just one of these subcategories into two further groups, again for a total of three, as Aristotle’s formulation clearly requires. The annotator of my copy opted for the dichotomous tree with two divisions as his device for generating the three categories.
Aristotle here discusses the various categories of our intellectus, or understanding. Saint Thomas points out that our understanding can manifest itself either ad actu (by action or impulse) or potentia (by potential).11 He then focuses his discussion on the modes by which mere potential can lead to action (ducit de pona ad actu). Saint Thomas first presents the single continuum in three stages: intell[e]c[t]us e[st] in triplici dispo[sition]e (the intellect is arrayed into three categories). (I have resupplied the missing letters in brackets, as the printer relies heavily on abbreviations as explained in the foregoing footnote.) But he then, immediately thereafter, suggests the alternate classification by dichotomy, with two splits. After the primary division into action and potential, the category of action remains discrete and divides no further. But the second category of pona (potentia, or potential) must then undergo a second dichotomous split: ille mod[us] s[u]bdividit i[n] duos (this mode divides into two), with a more easily activated category called propinqua, or nearby; and a less mobilizable division called remota, or remote.
Thus we may conceptualize the entire system either as a continuum of three states from most immediate to most distant (direct action, potential for easily inspired action, and potential more difficult to mobilize); or as a double dichotomous division, first into action and potential, with action then undivided, and potential further split into greater and lesser propensity for recruitment. Faced with these clear choices, probably the two most fundamental alternatives within our cognitive capacities (a single smooth continuum versus a set of successive dichotomous divisions), note how our diligent student and writer of marginalia opts for dichotomy. In a beautiful hand, our annotator has made his choice and drawn a dichotomous key, moving from left to right through two divisions, first into actu above (by action) and pona below (potentia, or by potential), with pona subdivided into ppinq (propinqua, or nearby, in the sense of easily activated) and remota (or distant, in the sense of disinclined).
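The choice our annotator faced can be stated in a few lines of code. The following minimal sketch simply records the two arrangements, with labels taken from the glosses above, and confirms that both yield the same three categories.

```python
# Two ways to hold Saint Thomas's three dispositions of the intellect,
# using the glosses given above as labels.

# (1) A single ordered continuum, from most immediate to most remote:
continuum = ["actu", "potentia propinqua", "potentia remota"]

# (2) The annotator's choice: successive dichotomies, with only the
#     'potentia' branch split a second time:
dichotomy = ("actu", ("propinqua", "remota"))


def leaves(node):
    """Flatten a nested pair structure into its leaf categories."""
    if isinstance(node, tuple):
        return [leaf for branch in node for leaf in leaves(branch)]
    return [node]


# Both arrangements name three categories; only the grouping differs.
print(leaves(dichotomy))          # ['actu', 'propinqua', 'remota']
assert len(leaves(dichotomy)) == len(continuum) == 3
```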
As a little footnote too sweet to omit (although irrelevant to the subject of classification by dichotomy), our diligent student shows his humanity in a conventional manner across the centuries, from his Latin to our barhopping. As a final comment to his discussion, Saint Thomas notes that easily inspired actions in the category of propinqua can be suppressed by circumstances external to the character of the intellect itself. He specifically mentions two: dolor vel ebrietas (sadness or drunkenness). Our commentator dutifully records one of these impediments in his elegant hand on the very next page: Ebrietas impediat scientia[m] (drunkenness impedes knowledge). I only mention this note because, at the very end of the book, after the printed AMEN and several hundred pages of copious annotations, our student finally grants himself a break, medieval style, by writing: Claudite jam rivos pueri sat prata biberunt, or (roughly): “Enough, boys, now let us conclude these thorough investigations and go to meadows to drink.” So much for intellectual potential, at least for a celebratory conclusion!
The tentacles of this cognitive bias extended far beyond simple branchings in sequential keys. More-complex classifications could also be constructed by allowing several dichotomous alternatives to interpenetrate, and by then listing all possible permutations to enumerate the potential categories. To cite a closing example, from the celebrated French physician and surgeon Ambroise Paré (1510–1590), and a case where the tribal idol of dichotomy imposed a false solution that impeded the development of medicine for centuries: the old medical theory of four bodily humors may not, at first, reveal any dichotomous basis. But, in fact, this fallacious system roots its taxonomy in the intersection of two dichotomous divisions: hot versus cold, and wet versus dry.
Under this theory, bodily health requires a balance of four distinct principles or humors (literally liquids)—blood, phlegm, choler, and melancholy. An emphasis upon one of the four leads, respectively, to distinctive temperaments or styles of personality that continue, at least descriptively, to designate certain human propensities: sanguine, phlegmatic, choleric, and melancholic. If the humors get more seriously out of whack, then bodily malfunction will ensue—not from the invasion of any foreign agent (as in the later germ theory of disease) or from failure to take in essential nutrients (at least not immediately, but only through their influence on the production of humors), but directly from the internal imbalance itself. The remedy for disease, with illness thus construed as an imbalance among humors, must focus upon techniques for reducing overactive humors and restoring the weakened components. The theory of humors therefore inspired centuries of belief in a large array of procedures now regarded as entirely ineffective, if not barbaric—including bloodletting (to reduce the sanguine humor), sweating, purging, vomiting, et cetera.
But why, in the absence of any direct evidence for the existence of such liquids, did classical medicine insist so strongly upon four, and only four, humors? The standard solution, so congenial to two modes of thought that flourished before the Scientific Revolution and then died with its success, invoked, first, a correspondence between the microcosm of the human body and the macrocosm of the universe; and, second, the four permitted categories of a double dichotomy to define the corresponding divisions of both the microcosm and macrocosm. Just as four humors balanced the microcosm (blood, phlegm, choler, and melancholy), so too did four elements (air, water, fire, and earth) build the macrocosm. In each case, these four represent all possible combinations of the two primal dichotomies for material things: hot versus cold, and wet versus dry.
Paré’s chart (from my 1614 edition of his collected works, figures 22 and 23) lays out all aspects of this system explicitly: blood corresponds with air and represents the hot and wet substance; phlegm represents water, the cold and wet element; choler corresponds with fire, hot and dry; whereas melancholy represents earth, cold and dry. (Note how the temperaments arise from this conception, with the hot and wet person as sanguine, or optimistic and level-headed; the cold and wet person as phlegmatic, or slow to act; the hot and dry person as choleric, or quick to anger; and the cold and dry person as melancholy, or just plain sad.)
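The combinatorial logic of the chart can be stated in a few lines. Here is a minimal sketch that enumerates the 2 × 2 permutations of the two qualities and reads off the correspondences just described; the labels simply follow the prose above.

```python
# The four humors as the 2 x 2 permutations of hot/cold and wet/dry,
# with the corresponding element and temperament as described above.
from itertools import product

correspondence = {
    ("hot",  "wet"): ("blood",      "air",   "sanguine"),
    ("hot",  "dry"): ("choler",     "fire",  "choleric"),
    ("cold", "wet"): ("phlegm",     "water", "phlegmatic"),
    ("cold", "dry"): ("melancholy", "earth", "melancholic"),
}

combinations = list(product(["hot", "cold"], ["wet", "dry"]))
assert len(combinations) == 4   # four, and only four, permutations exist

for qualities in combinations:
    humor, element, temperament = correspondence[qualities]
    print(f"{qualities[0]} and {qualities[1]}: {humor} ({element}, {temperament})")
```

The point of the exercise is only that the number four follows automatically once the two dichotomies are accepted; no observation of actual bodily liquids is required to fill out the table.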
I have, in this prolonged discussion, only addressed one helpful insight (albeit the primary benefit, in my judgment) that scientists could gain from their colleagues in the humanities: exposing the myth of objectivity by a positive acknowledgment (not a cynical and despairing shrug for inevitable loss) of the mental quirks and social influences upon all factual study of the natural world—for honest recognition can only breed self-awareness and greater practical sophistication about the mental processes that scientists must use to reach their accurate conclusions. But I also wish to mention, more briefly, two additional factors, also better known, appreciated, and more widely studied within the humanities—valuable stratagems of the fox for adding some excellent and fully honorable nuances of real effectiveness to the overly restricted or inadequately examined practices of science.
Figure 22.
Figure 23.
Humanists, for my second point, rightly stress the virtues and felicities of stylish writing, not as a mere frill or foppish attribute, but as a primary aid to attention and understanding. Scientists, on the other hand, and as a virtual badge of membership for admission to our professional club, tend to assert that although brevity and clarity should certainly be fostered, the nurturing of verbal style, as an issue of form rather than substance, plays no role in the study of material reality.
In fact, this explicit denial of importance to modes of communication has, unfortunately, engendered a more than merely mild form of philistinism among many scientists who not only view verbal skills as unimportant, but actually discount any fortuitous stylistic acumen among their colleagues as an irrelevant snare, casting suspicion upon the writer’s capacity for objectivity in presenting the data of nature. In an almost perverse manner, inarticulateness almost becomes a virtue as a collateral sign of proper attention to nature’s raw empirics versus distilled human presentation thereof. (And yet, to cite a pair of ironies, proving that the best scientists have always understood the value of both assiduous data gathering and elegant communication, John Ray composed even his denial of the importance of good writing—see the quotation on page 47—in his characteristically excellent prose. And the famous motto “le style c’est l’homme même” (style makes the man) did not emanate from a leader among the literati, but from the finest naturalist of eighteenth-century France, and a great writer as well—Georges Leclerc Buffon, whose forty-four-volume Histoire naturelle, equally admired for both style and content, became the first great encyclopedia of modern approaches to the study of nature.)
Because we have cut ourselves off from scholars in the humanities who pay closer attention to modes of communication, we have spun our own self-referential wheels and developed artificial standards and rules of writing that virtually guarantee the unreadability of scientific articles outside the clubhouse. Some of our conventions might also be called ludicrous in their utter failure to achieve a stated end, and in the guaranteed clunkiness of style thus engendered by rules that any good writer would immediately recognize as crippling. In my favorite example, scientists have trained themselves to write in the most unfelicitous of all English modes: the unrelenting passive voice. If you ask scientists for a rationale, they will reply with the two standard defenses: economy of presentation and objectivity of statement. Neither, in fact, makes any sense. Sentences in the passive voice tend to be longer than their corresponding active statements, while immodesty and personal glorification can proceed just as readily without the dreaded “I.” Which of the following do you prefer for brevity, modesty, and just plain felicity: “The discovery that was made was no doubt the most significant advance of our times”; or “I have discovered a procedure to solve the persistent problem ...”?