A, B, C: Three Short Novels
*2 Modernist experimental French writers in the twentieth century, such as Louis-Ferdinand Céline and Jean Genet, used largely the not-quite-four-thousand-word vocabulary—with bits of added slang—that the seventeenth-century writers Jean Racine and Pierre Corneille had used, three hundred years before. This is not the case with, say, American modernists such as Hemingway and Faulkner on the one hand and our seventeenth-century English writers John Donne and John Milton on the other. The difference between the two traditions, French and English, is an effect of the French Academy in the one and the lack of any equivalent in England, America, and Australia.
*3 “Critical Methods/Speculative Fiction”: initially I had written it in the Autumn of ’69 and delivered it to a group of enthusiastic science-fiction fans who met in a house in the beautiful Berkeley Hills. That meeting was hosted by a member of the family who made Tanqueray Gin—surely a resonance with what I will shortly write. At that year’s MLA, I read a version cut by half. The complete text was published in Quark/1 (Paperback Library, NYC, 1970), edited by Marilyn Hacker and myself. Today you can find it in The Jewel-Hinged Jaw, a revised edition of which is available from Wesleyan University Press (Middletown, 2010). For the record, that ’69 talk is among the last times I used the term “speculative fiction” before returning to the phrase, adequate for any critical use I have found myself in need of since: “science fiction.” As far as I can see, the basic meaning of “speculative fiction” is: “whatever science fiction I, the speaker, happen to approve of at ten o’clock Wednesday morning or at whatever moment I use the term,” which makes it a very slippery shifter and too vague to sustain a useful critical life in any analytical discussion. I have not used it, except more or less ironically, and then rarely, for forty-five years, though even today I run across people claiming it’s my “preferred term.” It’s not.
*4 With some eight thousand-plus others, our own turning galaxy arcs toward the Great Attractor in our supercluster of the galactic net, a supercluster containing the Virgo galaxy cluster at the end of one peninsula of galaxies off the parent structure, while ours is at the end of another, next to it. Till recently, we thought we were part of Virgo. But we’re not. Both our galaxy—the Milky Way—and the Virgo cluster are on short chains of galaxies that feed into the major supercluster (more like an unraveled ball of string than a swarm of bees), which is about a hundred times larger than astronomers thought even a few decades ago. Only this year have they started calling that larger structure Laniakea—Hawaiian for “Immeasurable Heaven.” Now it’s been measured and is currently among the biggest structures the descendants of our million-times great-grandmother (or great-aunt) “Lucy” and her many-times-grandson (or great-nephew), “Red Clay Man” (the meaning of the Hebrew name “Adam,” which tells not only what they thought he looked like but what they thought he was made of), have individuated, mapped, and named—though Lucy and Adam both probably saw fragments of it when they looked up at the naked night, as we can today. It’s about five hundred million light-years across. But many more structures of about that size link to it, to make the gravity-enchained galactic net. Google Laniakea or Perseus-Pisces or the Great Attractor or the Shapley Supercluster; or the Axis of Evil or the Bright Spot—all galaxy markers in our expanding map of the multiverse. All are impressive.
For all it doesn’t tell us about dark matter and dark energy, light carries an awesome amount of information throughout the multiverse, whether from the edges of the visible or from the leaf by my shoe sole at a puddle’s edge, information that links through evolution to why and how so many creatures—including most humans—have eyes.
*5 They burn up, melt, or both, and finally, with enough heat, diffuse as plasma, and under increased radiation even their atoms may eventually shatter.
*6 Those discursive structures stabilize our metaphysical assumptions that, as Derrida remarked, we are never outside of and are most deeply enmeshed in precisely when we are critiquing someone else’s.
There is a story, possibly apocryphal, about the philosopher Ludwig Wittgenstein, who was wandering one day over the lawns of Cambridge and looking at the sky, when one of his students saw him. “Professor Wittgenstein, are you all right? What are you doing…?”
The philosopher looked down and saw the student. (The novelist in me at this point always assumes Wittgenstein blinked.) “I’m trying to understand,” said the perplexed philosopher, “why, when the earth is turning and the sun is—relatively—in one place in the sky, it feels and looks as if the earth is still and the sun is moving around it.”
“Well…” said the student, perplexed now by the philosopher’s perplexity, “it’s because, I suppose, it just feels and looks that way when the earth is moving and the sun is standing still.”
“But if that’s the case,” replied the philosopher, “what would it look and feel like if the earth were actually still and the sun were actually moving?” And on that question, Wittgenstein turned, looked up again, and wandered off across the grass, leaving a very perplexed young man, now looking after him, now squinting toward the sun.

Your words and mine evoke—rather than carry—approximate meanings, already there at their destination, meanings that the order of words alone will rearrange and that must be interpreted further by probabilistic approximation to mean anything at all. It is only an effect that makes it feel as if they carry actual meanings from speaker or writer to hearer or reader. But if that’s the case, what would be the effect if they felt as if they only evoked meanings already there by probabilistic approximation…?
Life is made up of lots of “experience puns,” with an “obvious explanation” and several “not-so-obvious” ones. Enlarging on this property was the basis for much of the work of surrealist artists such as Pavel Tchelitchew, Max Ernst, and M. C. Escher.
Our metaphysics arises from assuming perceived resonances are causal even though we have no evidence for it, but without doing so we would be left with solipsism—itself a limit-case metaphysical assumption, but an assumption nevertheless. In short, we can either assume that stuff is there—or that it isn’t. (Maybe it’s something else, energy, idea, or pure God…) We have no logical proof for any of them. What we have is effects that seem to make us comfortable or uncomfortable, but comfort and discomfort, remember, are also effects. (We can work directly with the brain to change them, either temporarily or permanently.) We seem to be most comfortable assuming the very complex world we live in is there, and that all the complex things that have developed in it over the last five billion years to deal with it are, in fact, the case—and many of us feel even more comfortable when we can untangle contradictions in what appears obvious by means of other patterns we have been able to see in other places, with the aid of other techniques. (It’s called science.) Explore it, play, have fun, and try to learn and understand, even adjust—but is it really worth fighting with it to make yourself miserable about the way other folks want to explore, play, and learn? And most of us seem to feel better when we can help people who are suffering—because we all suffer.
*7 Because the situations that individuals, pairs, and smaller or larger communities of living creatures find ourselves moving through or settling down in are so different—situations which always entail a worldscape with conditions unique to it—it is not particularly efficient to wire in one set of responses to all situations. But it has been efficient since before the advent of language to wire in the ability to learn to adjust to different conditions, both by establishing habits and habit-systems and through more thoughtful responses; both always involve actions and inactions. To the extent these are always patterns, they are what rhetoric cuts the world up into and discourses stabilize, but they have had very little to do—at least up until recently (say, since the development of writing)—with our understanding of how the “process” works. Today, in the context of our hugely expanded world population, even over the last five hundred years—as the plurality of our cultures increasingly becomes the condition within which we must negotiate—our survival would appear to hinge more and more on understanding the process. Pollution is rampant. The climate has changed and not for the better. Because, as part of our cultures, we have already made such changes, along with our population expansion, in our so-varied worldscapes—the atmosphere, the ocean, the mined hills and fishable rivers, the arable lands and the slashed-back rain forests—it is imperative we do something about it, or as a species we will suffer far worse consequences than we have already started to. Types of bees, certain species of starfish, as well as tigers and wolves—and dozens of fish, birds, and butterflies—have become endangered over the last three decades. Our own human population numbers are out of hand, and the inequities among us, controlled by stupidity or mistaken for reason, are only going to do us and the planet in. We need to bring the population down, slowly, over generations, and with consent, though genocides, direct and indirect—both of which seed our own destruction—become more and more prevalent.
*8 The evolutionary journey from blindness to the ability to visually recognize individuals and places is as amazing as the journey from deafness and muteness to spoken language, if not more so. (And neither journey has been completed. Consider the importance of the overlap in the past five thousand years.) But it couldn’t have happened if we—and I include all of humankind’s forerunners—hadn’t first developed our ability to recognize groups of us and individuals among them by smell, all of which was innately entailed in the sexual imagination and—if people will let it be—still is.
*9 The indirect nature of communication, which we so easily mistake for direct exchange (because it is all we know), especially at the indistinct and misunderstood level of discourse, is the seat from which cultural misunderstandings rise up to rage and shake our fists against an uncomprehending Other. The understandings required are best gained by exposure to and participation in the conditions of life (now covered—though clumsily—by the notion of social construction), rather than through observations and explanations of them. Lacking that, the best textual aid is description of the conditions in the form the anthropologist Clifford Geertz called “thick description,” where the scribe endeavors to avoid imposing her or his own notions of what’s important and what’s not. But even this hurls us into the realm of chance. Experience is still all-important. But language must organize experience before experience can reorganize language. If that were not the case, there would be little or nothing to reorganize.
*10 Even communication of affection and the acknowledgment of the existence of others through touches and nuzzlings and lickings and caresses work the same way. Smell and taste are only slightly more direct, because they start out by depending on the shape of molecules that actually originate with the other, instead of wave functions that are not as material but more process, such as sound or light. But only slightly more so. And once within the thinking-experiencing-interpreting-feeling part of any creature (the brain), all are wave functions again. Smell is still our most intense memory prod. We fight it more and more; we use it less and less. But before you die, watch it save your—and maybe someone else’s—life at least three times, i.e., it gives the group a survival edge, which is only one piece of evidence for its usefulness and efficiency. To have evolved, it has to have others. Brain structures have built up to take care of “meanings” at the level of the word, of the phrase, of the sentence, of the topic, and any kind of physical pressure in general for every other stage of interpretation. Primates—not to mention mammals in toto—learn them mostly by exposure and some evolutionary pre-wiring. But learning must precede the “reception” of communication of what has been learned, and in all individuals the associational patterns that comprise learning occur at slightly different times and at different positions in the world and thus the learning process itself is different for each one of us, particularly today among us humans; which is to say, communication by sound is primarily a vibratory stimulation of something already there, not a material (or ideal) passage of something that is not.
This both is, and is why, information cannot pass directly between living creatures of any biological complexity. Information is the indirect evocation/creation of congruence, of pattern.
This is what discourse is and controls.
From one side, language can only be explained communally. From another, it can only be experienced individually. That’s because “community” and “individual” are abstractions that have been extremely efficient for negotiating lots of problems since writing came along. (Before that, we have no way to know for sure.) But as our population has grown so much bigger in (arbitrarily) the last two hundred fifty years, it’s begun to look more and more efficient to expand “community” from something tribal to something far more nuanced and ecologically inclusive. Some people see this as a return to tribalism. But it’s just as much a turn to science. As for “individual,” I can even entertain an argument that holds that “logos/discourse” was initially a metaphor put forward by philosophers such as Heraclitus and the Mesopotamian rabbis (which means “teachers”) to help stabilize the notion that language is never “our own,” but was always from another, at a time when there was not the technological or sociological support for a model that was, nevertheless, in its overall form, accessible to anyone who had ever learned to speak a language other than the one she or he grew up with, and/or watched a child learn its “own.” Most of a century later, Plato called all this prelearning “remembrance” and speculated it came through reincarnation. I don’t believe that was a step in the right direction, other than to nudge thinkers to pay attention to history. But little or nothing that creatures who have evolved do or think has only one use. That’s another thing evolution assures. That’s what we mean when we say an adaptation is efficient.
*11 The German philosopher Arthur Schopenhauer first made a large portion of the reading public for philosophy aware of the mediated (that is, indirect) structure of sensory perception for humans. But the fact is, this is true for all creatures who have senses as well as for plants that seem to be slowly developing something akin to them. Remember that the next time you take a walk in the woods. Yes, 95 percent of our genes are identical with chimpanzees. But 50 percent of them are identical with oak trees. We share genes with lizards, chickens, pond scum, mushrooms, and spiders, not to mention gnats, lichens, elephants, viruses, bacteria, nematodes, and the rest of life’s teeming species. That’s why we eat each other in so many directions; and it’s why a number of species, such as poisonous snakes and poisonous plants, have developed defenses to keep from being eaten. The fact that we share so many genes with everything that lives is one, but by no means the only, bit of evidence for our direct connections. And that creatures with ears and eyes and tactile feelings look, sound, and move as if they are alive in the world and care about being so—that is, they exist as subjects—is another; but, again, by no means the only or determining one. We live in a world constructed of a vast number of suggestions—and a relatively few explanations (relatively few because we only have the ones we’ve so far been able to figure out, in which there are bound to be inaccuracies and incompletenesses). Many of the explanations contravene the suggestions. The French psychiatrist Jacques Lacan called these two very human orders the Imaginary and the Symbolic. Different cultures have different Imaginaries and different Symbolics. What science says as a larger philosophy, at least to me, is that this multiplicity is a negotiable condition of the world, accessible to language and its potential behaviors, not an ontological bedrock of the universe: an effect, an illusion, if you like, that can be explained. I would only add: however you want to talk about it, it damned well better be. If not, we’ve had it.
*12 At the New York Society Library on February 3, 1848, Poe had hoped for hundreds to support his new magazine, The Stylus. It was the same month in the same year in which France would erupt in a revolution that, for a few brief months, would result in universal male suffrage and the hope for even more reforms, a victory which, in the weeks following, America would celebrate almost as joyfully as Paris, with fireworks from Washington, D.C., to Pittsfield, Massachusetts, where, at his Pittsfield home, Arrowhead, Melville was rushing through Mardi and Redburn so he could get started on Moby-Dick. Initially he’d planned to have a happy ending, say some critics, but all too shortly, within the year, the advances of the Revolution of 1848 had been rescinded—and Moby-Dick (1851) was rewritten with the tragic conclusion we know today, possibly on some level a response to the great historical disappointment, suggests the critic C. L. R. James (in his brilliant reading of the novel, Mariners, Renegades, and Castaways: Herman Melville and the World We Live In [1952; reprint 1978]), written while James himself was “detained”—like Cervantes, like Thomas Paine, like Thoreau, like Gramsci—in James’s case on Ellis Island, in the first years of the 1950s.