We came from the Level Above Human in distant space and we have now exited the bodies that we were wearing for our earthly task, to return to the world from whence we came—task completed. The distant space we refer to is what your religious literature would call the Kingdom of Heaven or the Kingdom of God. We came for the purpose of offering a doorway to the Kingdom of God at the end of this civilization, the end of this millennium.

  DOUSING DIMINUTIVE DENNIS’S DEBATE (DDDD = 2000)

  In 1697, on the day appointed for repenting mistakes in judgment at Salem, Samuel Sewall of Boston stood silently in Old South Church, as the minister read his confession of error aloud and in public. He alone among judges of the falsely accused “witches” of Salem had the courage to undergo such public chastisement. Four years later, the same Samuel Sewall made a most joyful noise unto the Lord—and at a particularly auspicious moment. He hired four trumpeters to herald, as he wrote, the “entrance of the 18th century” by sounding a blast on Boston Common right at daybreak. He also paid the town crier to read out his “verses upon the New Century.” The opening stanzas seem especially poignant today, the first for its relevance (I am writing this essay on a bleak January day in Boston, and the temperature outside is –2° Fahrenheit), and the second for a superannuated paternalism that highlights both the admirable and the dubious in our history:

  Once more! Our God vouchsafe to shine:

  Correct the coldness of our clime.

  Make haste with thy impartial light,

  and terminate this long dark night.

  Give the Indians eyes to see

  The light of life, and set them free.

  So men shall God in Christ adore,

  And worship idols vain, no more.

  I do not raise this issue either to embarrass the good judge for his tragic error, or to praise his commendable courage, but for an aspect of the tale that may seem peripheral to Sewall’s intent, yet nevertheless looms large as we approach the millennium destined to climax our current decade. Sewall hired his trumpeters for January 1, 1701, not January 1, 1700—and he therefore made an explicit decision in a debate that the cusp of his new century had kindled, and that has increased mightily at every similar transition since (see my main source for much of this section, the marvelously meticulous history of fins de siècle—Century’s End by Hillel Schwartz). When do centuries end? At the termination of years marked ’99 (as common sensibility suggests), or at the close of years marked ’00 (as the narrow logic of a particular system dictates)?

  The debate is already more intense than ever, though we still have a little time before our own forthcoming transition, and for two obvious reasons. First—O cursèd spite—our disjointed times, and our burgeoning press, provide greatly enhanced opportunity for rehearsal of such narrishkeit ad nauseam; do we not feast upon trivialities to divert attention from the truly portentous issues that engulf us? Second, this time around really does count as the ultimate blockbuster: for this is the millennium,* the great and indubitable unicum of any living observer (though a few trees, and maybe a fungus or two, but not a single animal, were born before the year 1000 and have therefore been through it before).

  Condemned in Hell (1499–1500), Luca Signorelli. Fresco. (illustration credit 2.1)

  On December 26, 1993, The New York Times ran a piece to bury the Christmas buying orgy and welcome the new year. This article, on commercial gear-up for the century’s end, began by noting: “There is money to be made on the millennium … In 999 feelings of gloom ran rampant. What the doomsayers may have lacked was an instinct for mass marketing.” The commercial cascade of this millennium is now in full swing: in journals, date books, the inevitable coffee mugs and T-shirts, and a thousand other products being flogged by the full gamut, from New Age fruitcakes of the counterculture, to hard-line apocalyptic visionaries at the Christian fringe, to a thicket of ordinary guys out to make a buck. The article even tells of a consulting firm explicitly established to help others market the millennium—so we are already witnessing the fractal recursion that might be called metaprofiteering, or growing clams of advice in the clam beds of your advisees’ potential profits.

  I am truly sorry that I cannot, in current parlance, “get with the program.” I feel compelled to mention two tiny little difficulties that could act as dampers upon the universal ballyhoo. First, millennia are not transitions at the ends of thousand-year periods, but particular periods lasting one thousand years; so I’m not convinced that we even have the name right (but see Part 1 for a resolution of this issue). Second, if we insist on a celebration (as we should) no matter what name be given, we had better decide when to celebrate. I devote this section to explaining why the second issue cannot be resolved—a situation that should be viewed as enlightening, not depressing. For just as Tennyson taught us to prefer love lost over love unexperienced, it is better not to know, and to know why one cannot know, than to be clueless about why the hell so many people are so agitated about 1999 versus 2000 for the last year before the great divide. At least when you grasp the conflicting, legitimate, and unresolvable claims of both sides, you can then celebrate both alternatives with equanimity—or neither (with informed self-righteousness) if your persona be sour, or smug.

  As a man of below average stature myself, I am delighted to report that the source of our infernal trouble about the ends of centuries may be traced to a sixth-century monk named Dionysius Exiguus, or (literally) Dennis the Short. Instructed to prepare a chronology for Pope St. John I, Little Dennis, following a standard practice, began countable years with the foundation of Rome. But, neatly balancing his secular and sacred allegiances, Dionysius then divided time again at Christ’s appearance. He reckoned Jesus’ birth at December 25, near the end of year 753 A.U.C. (ab urbe condita, or “from the foundation of the city,” that is, of Rome). Dionysius then restarted time just a few days later on January 1, 754 A.U.C.—not Christ’s birth, but the feast of the circumcision on his eighth day of life, and also, not coincidentally, New Year’s Day in Roman and Latin Christian calendars.

  Dionysius’s legacy has provided little but trouble. First of all, as discussed in more detail in Part 1, he didn’t even get the date right, for Herod died in 750 A.U.C. Therefore, if Jesus and Herod overlapped (and the gospels will have to be drastically revised if they did not), then Jesus must have been born in 4 B.C. or earlier—thus granting the bearer of time’s title several years of life before the inception of his own era!

  (I do, in any case, relish the oxymoron of Jesus born at least four years before Jesus. For various reasons, including resolution of this paradox and a desire for greater inclusivity in a diverse world containing lots of non-Christian folks, the B.C. terminology has been losing popularity of late. Some sources now use B.C.E.—for “before the Christian era” if they wish to tone down the oxymoron, or “before the common era” if they care about inclusivity. Scientists, recognizing absolutely nothing special about the B.C.–A.D. transition, tend to use B.P., or “before the present,” as in 32,410 B.P. for the oldest radiocarbon-dated Paleolithic cave painting from Chauvet in France—a good way to acknowledge the anachronistic irrelevance of Jesus’ birth for an earlier cave artist. In this system, 1950 counts as the present moment—an arbitrary decision that won’t cause much trouble for a while longer, but will eventually seem as capricious as the B.C.–A.D. cusp. At least we can all remember the cusp. But why should a scientist, two hundred years from now, honor 1950 in this manner?)

  But Dennis’s misdate of Jesus counts as a mere peccadillo compared with the consequences of his second dubious decision. He started time again on January 1, 754 A.U.C.—and he called this date January 1 of year one A.D. (Anno Domini, or “in the year of the Lord”)—not year zero (which would, in retrospect, have spared us ever so much trouble!). In short, Dennis did not begin time at zero, thus discombobulating our usual notions of counting. During the year that Jesus was one year old, the time system that supposedly started with his birth was two years old.
(Babies are zero years old until their first birthday; modern time was already one year old at its inception.)
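The off-by-one that Dennis bequeathed us fits in a single line of code. A minimal sketch in Python (the function name is mine, purely for illustration):

```python
def ad_label_at_age(age):
    """A.D. year label in effect while someone born at the very start
    of 1 A.D. is `age` years old.

    Because Dennis began the count at 1 rather than 0, the label runs
    one ahead of the age: the era was already labeled "1" at its first
    instant, and read "2" while its namesake was one year old.
    """
    return age + 1

# ad_label_at_age(0) == 1: modern time was "one year old" at its inception.
# ad_label_at_age(1) == 2: while Jesus was nominally one, the year read 2 A.D.
```

Start the count at zero instead, and label and age coincide, which is exactly the convenience Dennis could not supply.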

  We should not, however, be overly harsh on poor Dennis—for this most inconvenient choice could not have been avoided, and no blame can be laid on his doorstep (if monastic cubicles even included such an architectural feature for absorbing metaphorical abuse). Western mathematics in the sixth century had not yet developed a concept of zero to serve as a proper place marker across Dennis’s great divide. The Egyptians had used a zero, but only sporadically and inconsistently. The Chinese had no explicit numeral for zero, but their abacus implied the concept. The Mayans did develop a symbol for zero, but could not use the concept in a fully systematic way in their calculations (not to mention that Dennis knew absolutely nothing of either them or their entire hemisphere). Hindu and Arabic mathematicians devised the concept of zero in a complete and usable way—but not, apparently, before the late eighth or early ninth century—and Europe borrowed the idea from this source. Ironically, another figure in our narrative, the millennial Pope (and great scholar) Sylvester II, who reigned as pontiff from 999 to 1003, became the major exponent of zero, and of our modern Arabic system of numbers—but far too late for Dennis (and for surcease from the perpetual confusion that has reigned ever since).

  The problem of centuries arises from Dennis’s unfortunate, if historically inevitable, decision to start at one, rather than zero—and for no other reason! If we insist that all decades must have ten years, and all centuries one hundred years, then year 10 belongs to the first decade—and, sad to say, year 100 must remain in the first century. Thenceforward, the issue never goes away. Every year with a ’00 must count as the hundredth and final year of its century—no matter what common sensibility might prefer: 1900 joined the years 1801 through 1899 to complete the nineteenth century; and 2000 must be the completing year of the twentieth century, not the inception of the next millennium. Or so the pure logic of Dennis’s system dictates. If our shortsighted monk had only begun with a year zero, then logic and sensibility would coincide, and the wild millennial bells could ring forth but once and resoundingly at the beginning of January 1, 2000. But he didn’t.

  Since logic and sensibility do not coincide, and since both have legitimate claims upon our decision, the great and recurring debate about century boundaries simply cannot be resolved. Some questions have answers because obtainable information decrees a particular conclusion. The earth does revolve around the sun, and evolution does regulate the history of life. Some questions have no answers because we cannot get the required information. (I doubt that we will ever identify Jack the Ripper with certainty.) Many of our most intense debates, however, are not resolvable by information of any kind, but arise from conflicts in values or modes of analysis. (Shall we permit abortion, and in what circumstances? Does God exist?) A subset of these unresolvable debates—ultimately trivial, but capable of provoking great agitation, and thus the most frustrating of all—have no answers because they are about words and systems, rather than things. Phenomena of the world (that is, “things”) therefore have no bearing upon potential solutions. The century debate lies within this vexatious category.

  Detail from Condemned in Hell (1499–1500), Luca Signorelli. (illustration credit 2.2)

  The logic of Dionysius’s arbitrary system dictates one result—that centuries change between ’00 and ’01 years. Common sensibility leads us to the opposite conclusion: We want to match transitions with the extent or intensity of apparent sensual change, and 1999 to 2000 just looks more definitive than 2000 to 2001. So we set our millennial boundary at the change in all four positions, rather than the mere increment of 1 to the last position. (I refer to this side as “common sensibility” rather than “common sense” because support invokes issues of aesthetics and feeling, rather than logical reasoning.)

  One might argue that humans, as creatures of reason, should be willing to subjugate sensibility to logic; but we are, just as much, creatures of feeling. And so the debate has progressed at every go-round. Hillel Schwartz, for example, cites two letters to newspapers, written from the camp of common sensibility in 1900: “I defy the most bigoted precisian to work up an enthusiasm over the year 1901, when we will already have had twelve months’ experience of the 1900’s.” “The centurial figures are the symbol, and the only symbol, of the centuries. Once every hundred years there is a change in the symbol, and this great secular event is of startling prominence. What more natural than to bring the century into harmony with its only visible mark?” Since these strong expressions precede the invention of the automobile odometer, we cannot attribute current preferences for honoring 2000 to the most obvious device that now concentrates our attention upon the numerical side of millennial transitions. (My dad once took me and my brother on a late night ten-mile ride around Flushing—just so we could see the odometer go from 9999 to 10000—rather than saving the pleasure for his solo trip to the office next morning. I’ll bet that half the readers of this essay could cite a similar experience.)

  I do so love human foibles; what else can keep us laughing (as we must) in this tough world of ours? The more trivial and unresolvable an issue, the more the heat of debate, and the assurance of absolute righteousness, intensify on each side. (Just consider professorial arguments over parking places at university lots.) The same clamor arises every hundred years. An English participant in the debate of 1800 versus 1801 wrote of “the idle controversy, which has of late convulsed so many brains, respecting the commencement of the current century.” On January 1, 1801, a poem in the Connecticut Courant pronounced a plague on both houses (but sided with Dionysius):

  Precisely twelve o’clock last night,

  The Eighteenth Century took its flight.

  Full many a calculating head

  Has rack’d its brain, its ink has shed,

  To prove by metaphysics fine

  A hundred means but ninety-nine;

  While at their wisdom others wonder’d

  But took one more to make a hundred.

  The same smugness reappeared a century later. The New York Times, with anticipatory diplomacy, wrote in 1896: “As the present century draws to its close we see looming not very far ahead the venerable dispute which reappears every hundred years—viz: When does the next century begin?… There can be no doubt that one person may hold that the next century begins on the 1st of January, 1900, and another that it begins on the 1st of January, 1901, and yet both of them be in full possession of their faculties.” But a German commentator remarked: “In my life I have seen many people do battle over many things, but over few things with such fanaticism as over the academic question of when the century would end.… Each of the two parties produced for its side the trickiest of calculations and maintained at the same time that it was the simplest matter in the world, one that any child should understand.”

  You ask where I stand? Well, publicly of course I take no position because, as I have just stated, the issue is unresolvable: for each side has a fully consistent argument within the confines of different but equally defensible systems. But privately, just between you and me, well, let’s put it this way: I know a mentally handicapped young man who also happens to be a prodigy in day-date calculation. (He can, instantaneously, give the day of the week for any date, thousands of years, past or future—see Part 3.) He is fully aware of the great century debate, for nothing could interest him more. I asked him recently whether the millennium comes in 2000 or 2001—and he responded unhesitatingly: “In 2000. The first decade had only nine years.”
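For readers who want to check the savant's answers (though certainly not to replicate his method, which remains his own), Zeller's congruence gives the day of the week for any Gregorian date. A sketch in Python:

```python
def day_of_week(year, month, day):
    """Day of the week for a date in the (proleptic) Gregorian calendar,
    computed by Zeller's congruence."""
    if month < 3:            # Zeller counts Jan/Feb as months 13/14 of the prior year
        month += 12
        year -= 1
    k = year % 100           # year within the century
    j = year // 100          # the (zero-based) century
    h = (day + 13 * (month + 1) // 5 + k + k // 4 + j // 4 + 5 * j) % 7
    # In Zeller's convention, h == 0 means Saturday, h == 1 Sunday, and so on.
    return ["Saturday", "Sunday", "Monday", "Tuesday", "Wednesday",
            "Thursday", "Friday"][h]

print(day_of_week(2000, 1, 1))   # Saturday
print(day_of_week(2001, 1, 1))   # Monday
```

Both candidate millennial mornings are thus easily looked up: January 1, 2000 fell on a Saturday, January 1, 2001 on a Monday.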

  What an elegant solution, and why not? After all, no one then living had any idea whether they were toiling in year zero or year one—or whether their first decade had nine or ten years, their first century ninety-nine or one hundred. The B.C.–A.D. system wasn’t invented until the sixth century and wasn’t generally accepted in Europe until much later. So why don’t we just proclaim that the first century had ninety-nine years—since not a soul then living either knew or cared about the anachronism that would later be heaped upon all the years of their lives? Centuries can then turn when common sensibility desires, and we underscore Dionysius’s blessed arbitrariness with a caprice, a device of our own that marries the warring camps. Neat, except that I think people want to argue passionately about trivial unresolvabilities—lest they be compelled to invest such rambunctious energy in real battles that might kill somebody.

  Detail from The Last Judgement (1536–1541), Michelangelo. (illustration credit 2.3)

  What else might we salvage from rehearsing the history of a debate without an answer? Ironically, such arguments contain the possibility for a precious sociological insight: Since no answer can arise from either the factuality of nature or the internal necessities of human logic, changing viewpoints provide “pure” trajectories of evolving human attitudes—and we can therefore map societal trends without impediments of such confusing factors as definite truth.