Now that I had seen the Denali Fault and the pipeline, there was, however, one final thing to see—and that required my driving yet another 4,000 miles back down south. So the following morning, and for the better part of the following week, I was on the road again: more caribou, more moose, and even a pair of bald eagles in the Rockies near Calgary. I crossed the American border in the windswept plains of northern Montana, zipped past the Minuteman nuclear-missile silos that I thought had long ago been dismantled and grassed over; and, after a day spent meandering in the rain shadow of the mountains, came to the pretty university town of Bozeman and finally to my goal, by way of the imposing stone gateway into the first of America’s great national parks, Yellowstone.
THE PARK IS A PLACE of almost indescribable spectacle, rightly popular and in consequence frequently, especially in the high summer, more crowded than is good for it. The wildlife, the mountains, the lakes, and the geysers are all the very obvious lures for the hundreds of thousands who each season drive in through the park’s main gates. And these days there is a new reason: The widely publicized knowledge that Yellowstone Park sits on top of a potential supervolcano, whose eruption—at some unpredictable moment in the geological near term—will devastate nearly all of western America.
Most of Yellowstone is, in fact, the relic of a family of great volcanoes. There have been three periods of eruption, the first about 2 million years ago, the latest finishing around 600,000 years ago, with each spitting out, very violently, immeasurable quantities of lava and dust and ash. The last eruption left behind today’s caldera with its jagged mountain edge; this is filled at its southeastern corner with the large body of water that is Lake Yellowstone, drained northward by the spectacular river of the same name. The last eruption, in other words, left behind the very reason for the existence of the national park.
In recent years seismologists and geochemists have been poring over the park’s geology and its character deep underground, and have found something that they had long suspected but had not until now been able fully to prove—and it is that specific something that is drawing in the crowds. For it turns out that only a few thousand feet below the ground level of the park, the enormous body of molten magma that fed all the volcanoes of the past is not only still there—but, to use a metaphor of which the ancients would have approved, it is alive. Moreover, it is moving. It appears to be doing something that the imaginative would say is akin to breathing.
For recent measurements have shown that the bed of Lake Yellowstone, as well as the bottom of the river close to the point where it flows out of the lake, is rising and falling, slowly and rhythmically, year in, year out. It is as though a giant is slumbering beneath the lake, readying itself—for what? To awaken? To turn in its sleep? To snore? To stand suddenly and tear loose all that lies on top?
Such images are potent, and they have captured the American imagination in recent years—not least because even the sober scientists of the United States Geological Survey, who come in ever more frequently from their regional offices in Colorado and California and beyond, shake their heads in wonder at the scale of the eruption-in-waiting.
The greatest of the unknowns and the indeterminables, though, is when something might happen. Those who study the sleeping beast in the caldera offer as their safest answer only that it will happen sometime soon, using that word in its strictly geological sense. And that could be a quarter of a million years. After all, the first eruption began 2.1 million years ago, the second 1.3 million years ago, and the third 1.2 million years ago—and this last eruption lasted, if it is possible to imagine such a thing, for 600,000 years. It completed its work 600,000 years ago (the duration and the time elapsed since, confusingly, happen to be the same figure)—meaning that there has been no activity for that length of time, a period that is fully (to remind ourselves of the insignificance of mankind) five times as long as recognizable Homo sapiens has been bipedally extant on the face of the planet.
Yellowstone is thus, on purely statistical grounds, ready for an eruption almost any day. And when it happens, it will almost certainly be vast. Yellowstone is already one of the biggest explosive volcanic complexes on the surface of the planet, and when it goes, huge tracts of the western states will be covered with immense thicknesses of volcanic products. But as to what will trigger it—what cosmic whim might begin the ruin of the West—no one can say, or really even imagine.
Except, that is, for one small thing, and an even more alluring mystery. Researchers who were working in the park in the autumn of 2002 noticed that for some unexplained reason a number of the famous geysers that were regularly hurling steam and boiling water high into the air, to the delight of the tourists, were doing so slightly more frequently than usual.
There was no indication that the famous Old Faithful geyser was affected; but a number of others, smaller but equally faithful to the chronometers, did begin to display a puzzling change. The Daisy Geyser in the Upper Geyser Basin, for example, would in September and October that year invariably send its column of blue water shooting into the air every two hours, thirty minutes, and twelve seconds. But in November and December that rate changed: It shot out its water and steam every one hour, fifty-seven minutes, and forty-one seconds—a shortening of more than half an hour, over a fifth of the entire interval. It was an enormous change, and it occurred without any obvious reason.
And the other geysers nearby—the Castle, Plate, and Plume Geysers in the Upper Geyser Basin, the Pink Geyser in the Lower Geyser Basin, and the Lone Pine Geyser—showed much the same pattern. They suddenly speeded up—and no one had the slightest idea why. Everything seemed to change in early November. There was a flurry of small earthquakes—and the geysers all began to go mad. And for no reason—except, as someone suddenly realized, Alaska.
For it turned out that the timing of the changes all coincided, to the hour, with the arrival of the big surface waves from the Denali Earthquake of November 3—the quake that had prompted my visit to Alaska to see how the pipeline had fared. There was, it seemed, a link.
Even though the two places—the Yellowstone caldera and the Denali Fault—are separated by 1,800 miles of rock and mountain and river and lake, the occurrence of trauma in one place seems to have an effect on the other, as though the whole of western America were ringing like an immense brass bell.
Which brought me back to the premise with which this account began—the notion that this fragile planet, suspended in the blackness of space, is now something to be considered as an immense whole, with all of its elements interlinked and interconnected, the one happenstance triggering another and another and another, for as long as the world exists. The butterfly effect, written into the rocks of the American West, and into the rest of the world as well.
One day the researcher who had discovered the effects of Alaskan quakes on Yellowstone’s geysers took me to see Daisy: Might its timing signal some distant event elsewhere in the West? So I waited with him in the late spring sunshine, looking at my wristwatch, gazing across from behind a pinewood palisade toward the yellow patch of sulfur on the ground, surrounded by pools of blue groundwater. All was quiet.
The minute hand ticked slowly up to twenty-eight, then twenty-nine, and then thirty—and then suddenly, with a brief subterranean burp and a gurgle and a moan and a great whoosh, Daisy exploded, releasing its pure-white fountain of boiling water high up into the chilly air. A hundred children cheered, and a hundred cameras clicked to record the moment. It was all a classic American spectacle.
And it had happened right on time. The geyser erupted just when it was supposed to. Not seconds earlier, not seconds later. My scientist guide looked a little disappointed, I thought. The world beyond Yellowstone was evidently quiet that day. Nothing had happened to change Daisy’s eruptive timetable. Alaska had not had an earthquake, and nor, quite probably, had San Francisco.
But one day each of these places will have an earthquake. There is not a scintilla of doubt about it. Earthquakes, volcanoes, tsunamis are all as inevitable a part of the earth’s story as sunrise and sunset are a part of the quotidian routine, the only signal difference being the rhythm and the pitiless irregularity of their occurrence. One day this place of mountains, lakes, and rivers will break and explode. New volcanoes will thrust out of the earth, produce even more rock, and lay it down on top of whatever monuments humans may have imprudently chosen to place in the way.
All that humans do, and everywhere that humans inhabit, is for the moment only—like the cherry blossoms in a Japanese springtime that are exquisite simply by virtue of their very impermanence. Geology, particularly the dramatic New Geology one sees in a place like Yellowstone, or on the Denali Fault, or on the great San Andreas Fault, serves as an ever-present reminder of this—of the fragility of humankind, the evanescent nature of even our most impressive achievements.
It serves as a reminder that it is only by the planet’s consent that places like the mountains of Montana and Wyoming exist, and only by the planet’s consent that all towns and all cities—New Madrid, Charleston, Anchorage, Banda Aceh, and San Francisco among them—survive for as long as they do. It is a reminder, too, that this consent is a privilege, and one that may be snatched away suddenly, and without any warning at all.
APPENDIX
On Taking an Earthquake’s Measure
MOST OF THE PHYSICAL PHENOMENA THAT AFFECT OR afflict the planet can be measured, and thus recorded numerically. Temperature, for example, can be measured by a thermometer of some kind, and noted in degrees Fahrenheit or Celsius. Atmospheric pressure’s effects on a closed cylinder can similarly be recorded in millibars or the more modern units of hectopascals. Wind speed can be measured in knots or miles per hour, or, in cases where the wind affects the sea, by employing the famous scale that was first adopted by Admiral Sir Francis Beaufort* at the beginning of the nineteenth century. Tsunami waves have measurable heights and speeds. And even the eruption of a volcano can be assigned an index, although it is one determined with some difficulty: The amount of material estimated to have been expelled from the crater is multiplied by the height to which it is thrown up into the air.
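To put the arithmetic in more concrete terms: the actual Volcanic Explosivity Index is a logarithmic scale, running from 0 to 8 and keyed chiefly to the estimated volume of ejecta, with the height of the eruptive column as a supporting criterion. The short Python sketch below is an illustration only (its function name and its handling of the two quantities are assumptions of mine, not an official formula), but it shows how the "amount multiplied by height" idea collapses into a manageable number once logarithms are taken.

```python
import math

def toy_explosivity_index(ejecta_volume_m3: float, plume_height_km: float) -> float:
    """Illustrative only: the logarithm of the product described in the
    text (material expelled, multiplied by the height it reaches).
    The real VEI is a 0-8 logarithmic scale keyed mainly to ejecta volume."""
    volume_term = math.log10(max(ejecta_volume_m3, 1.0))  # each +1 means ten times more material
    height_term = math.log10(max(plume_height_km, 1.0))   # height contributes far less
    return volume_term + height_term

# Roughly one cubic kilometre of ejecta (1e9 m^3) beneath a 25 km plume:
print(toy_explosivity_index(1e9, 25.0))  # about 10.4 on this made-up scale
```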
ON INTENSITY
But earthquakes cannot be tidily transformed into numbers. They are so huge, unwieldy, and unpredictable that for most of human history they could only be suffered, defying classification by any kind of standard numerical arrangement. Instead, in the days before seismographs were invented, the only way of quantifying a quake was to do so subjectively, by looking at the damage it had caused, asking people exactly how they had experienced its shocks, and then assigning to it a number that described what by common consent came to be termed the quake’s intensity. This figure would very obviously vary—the intensity would be greatest near the earthquake’s epicenter, and it would diminish the farther away an observer was. The maximum intensity would provide some approximate measure of how big the event was when compared with another—but it would be a subjective assessment only and, moreover, one that required people who had been on hand, who had been affected and were fully aware, and who were intelligent enough to report. An earthquake that happened in an unpopulated area would pass unrecorded and unmeasured.
The first systematic investigation of intensity was probably the one conducted by Robert Mallet, the Irish engineer who first came up with the term seismograph in 1854. In late December 1857 he applied to the Royal Society for a travel grant to allow him to study in detail the great earthquake that had just occurred in the Kingdom of Naples, and he spent two months there systematically observing what had evidently been a very major seismic event. Eventually he was able to draw maps showing the areas of the earthquake’s greatest evident intensity, with lines—which he called isoseismals—connecting places that had suffered the same amount of damage, or where people or animals had experienced the same amount of shaking. From his map he was able to infer the center of the earthquake’s shaking, and was also able to show how the intensity diminished with distance from this center. Once he compared his survey with the crude information that he was able to glean about other earthquakes, he was able to assign an approximate relative size to this Italian event: There was no possibility of giving the event an absolute size, but to be able to say whether it was greater or less great than another event in another part of the world did have some utility and purpose. His approach was, in consequence, seized upon by the earthquake scientists of the day as the best means available to determine a quake’s size—and over the years a number of these scientists examined data in more detail and developed formal scales of intensity, some of them still in use today.
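Mallet’s logic can be mimicked in miniature. The short Python sketch below, with wholly invented coordinates and intensities, estimates a center of shaking as an intensity-weighted average of the observation points; it is an illustrative reconstruction of the idea, not Mallet’s actual procedure.

```python
# A minimal sketch of the isoseismal idea: given scattered reports of felt
# intensity, estimate the center of shaking as an intensity-weighted
# centroid. Every coordinate and intensity below is invented.
observations = [
    # (latitude, longitude, reported intensity on a I-to-X style scale)
    (40.35, 15.80, 9),
    (40.50, 15.60, 7),
    (40.10, 16.00, 8),
    (40.70, 15.90, 5),
    (39.90, 15.50, 6),
]

total_weight = sum(i for _, _, i in observations)
center_lat = sum(lat * i for lat, _, i in observations) / total_weight
center_lon = sum(lon * i for _, lon, i in observations) / total_weight

print(f"Estimated center of shaking: {center_lat:.2f} N, {center_lon:.2f} E")
```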
The first of these intensity scales was developed in the 1880s by the Italian scientist M. S. de Rossi and the Swiss scientist François Forel: It assigned ten values to earthquake intensities, numbered between I and X (Roman numerals were the most popular convention for intensity scales, and they are still in general use today). In the Rossi-Forel scheme the number I signified a slight quake, one rather confusingly defined as having been recorded by a single seismograph or by some seismographs of the same pattern, but not by several seismographs of different kinds, with the shock felt also by an experienced observer. At the other end of the scale the number X was used to designate a seismic happening marked by great disasters, ruins, disturbance of strata, fissures in the earth’s crust, rock falls from mountains. Between I and X were all the more common earthquakes that wrought more or less damage and occasioned more or less havoc or casualties—and so a quake with intensity VIII was very bad, while an event of recorded intensity III was just this side of bearable.
Despite this vague and somewhat ambiguous calibration, the Rossi-Forel Scale became in short order enormously popular among seismologists and, perhaps more important, with the public, which found it easy to appreciate. It was in consequence used to describe most of the major seismic events that occurred in the Western world during the last quarter of the nineteenth century.
Before long, however, it was seen that the Rossi-Forel Scale had some important shortcomings. Engineers reported that the damage sustained by a city’s buildings—whose collapses and other sufferings were used as prime indicators of the intensity of the earthquake that had affected them—was determined not only by the strength of an earthquake, but also by how the buildings had been constructed in the first place. Japanese earthquakes, for instance, caused very different kinds of damage to the local wood-and-paper houses—a realization that led to a whole new scale, the seven-step Omori Scale, being invented to deal with that problem (it is still used today, but only in Japan).
Moreover, it also turned out that earthquake waves propagate in different ways, at different rates, and with different consequences, according to the nature of the rocks through which they pass. Intensity scales were thus rapidly understood to be not simply subjective and ambiguous but also local. Any one earthquake might display a number of very different apparent intensities in locations just a few hundred yards apart, since buildings might react (collapsing or not, as the case might be) first, according to whether they were built of brick, masonry, iron, paper, mud, felt, or wattle, and second, according to whether they had been constructed on a base of mud or granite, schist or shale.
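Seismologists today capture this fall-off with empirical attenuation relations. The Python sketch below uses a common textbook form, I(r) = I0 - a*log10(r) - b*r, with coefficients invented purely for illustration; in practice they must be fitted region by region, precisely because, as just noted, the local geology changes how the waves travel.

```python
import math

def felt_intensity(i0: float, distance_km: float,
                   a: float = 3.0, b: float = 0.002) -> float:
    """Textbook-style attenuation: I(r) = I0 - a*log10(r) - b*r.
    The coefficients a and b are invented for illustration; in practice
    they are fitted region by region, because the local rocks change
    how the waves propagate."""
    r = max(distance_km, 1.0)  # avoid taking the log of zero at the epicenter
    return i0 - a * math.log10(r) - b * r

# How an intensity-X (here, 10) shock fades with distance:
for r in (1, 10, 50, 200):
    print(f"{r:>3} km -> intensity {felt_intensity(10.0, r):.1f}")
```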
So this first, very basic intensity scale had to be modified and modified again as more and more information about more and more variables poured in. The Italian seismologist and vulcanologist Giuseppe Mercalli came up with what he felt was a more accurate ten-step scale in 1902; but it was soon found wanting in its own way and was modified by a man named August Sieberg, who expanded it into a twelve-step scale—I to XII—which was to become (under the name the Mercalli-Cancani-Sieberg Scale, or MCS) the basis of all modern measurements, particularly in southern Europe. Two Americans, Harry Wood and Frank Neumann, then translated this scale into English in the 1930s and called it the Modified Mercalli Intensity Scale, or MMI, versions of which were popular around most of the Western world for many years.
It became ever more complicated from this point on. In 1956 Charles Richter—the nudist, vegetarian, womanizing, Asperger’s syndrome–afflicted seismologist from Caltech, who long before had created an entirely different and much more widely used scale of his own, as we shall see—weighed in and fine-tuned the MCS to become the “Modified Mercalli Scale of 1956,” which is generally known as the MM56. And even though this scale is thought to be perfectly good and accurate (at least for most of the world beyond Japan), the process of refinement and universalization went on and on and on. A supposedly ultrasophisticated and more “powerful” scale known as the Medvedev-Sponheuer-Karnik Scale, or MSK64, became popular in the 1960s and 1970s,* and then the most recent intensity scale to be fully agreed on by the world seismological authorities was put forward at an international congress of earthquake scientists held in Zurich in 1990. It was refined at meetings in Potsdam, Munich, and Reykjavik and then finally published in 1998 under the name the European Macroseismic Scale, or the EMS. And this, it seems universally agreed, is to be the current gold standard.
For though the EMS is made to look rather complicated—because it includes the classification of seismically affected buildings into different categories of construction—it is now fully agreed upon internationally and, for the time being, absolute. It represents the world consensus (aside from Japan, which sticks stubbornly to its very outdated Omori Scale) on the best way to calibrate earthquakes on a human scale. It succeeds all the others, including the time-honored Rossi-Forel, the elegant Mercalli, and Mercalli’s various offspring, such as the MM56, the MSK64, and the MSK81 scales.
In the table that follows, which presents in summary what is now formally known as the EMS-98 Intensity Scale, there are a number of variables. The designation (a), for example, represents the earthquake’s effects on humans, while (b) represents the same event’s effects on “objects and nature” and (c) its effects on buildings. The six categories of buildings, listed as A to F, represent, on the other hand, a scale of construction vulnerability—one that runs from buildings made of unreinforced masonry (which is very vulnerable) to those built using highly earthquake-resistant steel modules that by law have to be erected in earthquake-prone cities. The damage scale of 1 to 5 indicates differing degrees of effect—from slight cracking to total destruction. It is complex but in its essence self-explanatory, and its ingredients can be pictured as a simple structure, as in the sketch below.
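For readers who find it easier to see such a scheme laid out programmatically, here is a minimal Python sketch of the ingredients just described: the (a), (b), and (c) designations, the A-to-F vulnerability classes, and the 1-to-5 damage grades. The field names and paraphrased grade labels are assumptions for illustration only, not official EMS-98 notation.

```python
from dataclasses import dataclass

# Vulnerability classes run from A (most vulnerable, such as unreinforced
# masonry) to F (least vulnerable, engineered earthquake-resistant construction).
VULNERABILITY_CLASSES = ("A", "B", "C", "D", "E", "F")

# Damage grades 1-5, with paraphrased labels (assumed wording, not the
# official EMS-98 text).
DAMAGE_GRADES = {
    1: "negligible to slight damage",
    2: "moderate damage",
    3: "substantial to heavy damage",
    4: "very heavy damage",
    5: "destruction",
}

@dataclass
class Ems98Entry:
    """One row of the intensity table, as described in the text."""
    intensity: int             # I to XII, held here as 1 to 12
    effects_on_people: str     # the scale's designation (a)
    effects_on_objects: str    # designation (b), "objects and nature"
    effects_on_buildings: str  # designation (c), read against class and grade

# An invented, paraphrased example from the gentle end of the scale:
entry = Ems98Entry(2, "felt only by a very few people at rest indoors",
                   "no effect", "no damage")
print(entry.intensity, "-", entry.effects_on_people)
```

The EMS-98 Scale, then, reads as follows: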