  Now, though, the governments and peoples of the Earth are gradually becoming aware of yet another dangerous consequence of the burning of fossil fuels: If I burn a piece of coal or a gallon of petroleum or a cubic foot of natural gas, I’m combining the carbon in the fossil fuel with the oxygen in the air. This chemical reaction releases energy locked away for perhaps 200 million years. But in combining a carbon atom, C, with an oxygen molecule, O2, I also synthesize a molecule of carbon dioxide, CO2:

  C + O2 → CO2

  And CO2 is a greenhouse gas.

  —

  What determines the average temperature of the Earth, the planetary climate? The amount of heat trickling up from the center of the Earth is negligible compared with the amount falling down on the Earth’s surface from the Sun. Indeed, if the Sun were turned off, the temperature of the Earth would fall so far that the air would freeze solid, and the planet would be covered with a layer of nitrogen and oxygen snow 10 meters (30 feet) thick. Well, we know how much sunlight is falling on the Earth and warming it. Can’t we calculate what the average temperature of the Earth’s surface ought to be? This is an easy calculation—taught in elementary astronomy and meteorology courses, another example of the power and beauty of quantification.

  The amount of sunlight absorbed by the Earth has to equal on average the amount of energy radiated back to space. We don’t ordinarily think of the Earth as radiating into space, and when we fly over it at night we don’t see it glowing in the dark (except for cities). But that’s because we’re looking in ordinary visible light, the kind to which our eyes are sensitive. If we were to look beyond red light into what’s called the thermal infrared part of the spectrum—at 20 times the wavelength of yellow light, for example—we would see the Earth glowing in its own eerie, cool infrared light, more in the Sahara than Antarctica, more in daytime than at night. This is not sunlight reflected off the Earth, but the planet’s own body heat. The more energy coming in from the Sun, the more the Earth radiates back to space. The hotter the Earth, the more it glows in the dark.
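  That "thermal infrared" figure is easy to check with Wien's displacement law, which relates a body's temperature to the wavelength at which it glows most brightly. Here is a minimal sketch in Python, using the standard textbook constant; the temperatures and labels are illustrative values of my choosing:

```python
# Wien's displacement law: a blackbody at temperature T radiates most
# strongly at wavelength lambda_max = b / T. This checks the claim that
# the Earth's own glow peaks in the thermal infrared, at roughly
# 20 times the wavelength of yellow light.

WIEN_B = 2.898e-3   # Wien's displacement constant, meter-kelvins
YELLOW = 0.55e-6    # wavelength of yellow light, meters

for temp_k, label in [(288.0, "Earth's surface, about 15 C"),
                      (5778.0, "Sun's photosphere")]:
    peak = WIEN_B / temp_k
    print(f"{label}: peak near {peak * 1e6:.1f} micrometers "
          f"({peak / YELLOW:.0f}x yellow light)")
```

  The Sun, at nearly 6,000 kelvins, peaks in visible light; the Earth, at around 288 kelvins, peaks near 10 micrometers, deep in the infrared.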

  What’s coming in to warm the Earth depends on how bright the Sun is and how reflective the Earth is. (Whatever isn’t reflected back into space is absorbed by the ground, the clouds, and the air. If the Earth were perfectly shiny and reflective, the sunlight falling on it wouldn’t warm it up at all.) The reflected sunlight, of course, is mainly in the visible part of the spectrum. So set the input (which depends on how much sunlight the Earth absorbs) equal to the output (which depends on the temperature of the Earth), balance the two sides of the equation, and out comes the predicted temperature of the Earth. A cinch! Couldn’t be easier! You calculate it, and what’s the answer?
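  For readers who want to run the numbers themselves, here is a minimal sketch of that balance in Python, assuming the standard textbook values for the solar constant and the Earth's reflectivity (albedo):

```python
# Energy balance: the Earth intercepts sunlight over a disk of area
# pi * R^2 but radiates from its whole spherical surface, 4 * pi * R^2.
# Setting absorbed sunlight equal to emitted heat, the radius cancels:
#   S * (1 - A) / 4 = sigma * T^4

SOLAR_CONSTANT = 1361.0   # sunlight at Earth's distance, W per square meter
ALBEDO = 0.31             # fraction of sunlight reflected straight back
SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W m^-2 K^-4

t = (SOLAR_CONSTANT * (1 - ALBEDO) / (4 * SIGMA)) ** 0.25
print(f"Predicted average temperature: {t:.0f} K = {t - 273.15:.0f} C")
# -> about 254 kelvins, roughly 19 degrees Celsius below freezing
```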

  Our calculation tells us that the average temperature of the Earth should be about 20° Celsius below the freezing point of water. The oceans ought to be blocks of ice and we all ought to be frozen stiff. The Earth should be inhospitable to almost all forms of life. What’s wrong with the calculation? Did we make a mistake?

  We didn’t exactly make a mistake in the calculation. We just left something out: the greenhouse effect. We implicitly assumed that the Earth had no atmosphere. While the air is transparent at ordinary visible wavelengths (except for places like Denver and Los Angeles), it’s much more opaque in the thermal infrared part of the spectrum, where the Earth likes to radiate to space. And that makes all the difference in the world. Some of the gases in the air in front of us—carbon dioxide, water vapor, some oxides of nitrogen, methane, chlorofluorocarbons—happen to absorb strongly in the infrared, even though they are completely transparent in the visible. If you put a layer of this stuff above the surface of the Earth, the sunlight still gets in. But when the surface tries to radiate back to space, the way is impeded by this blanket of infrared absorbing gases. It’s transparent in the visible, semi-opaque in the infrared. As a result the Earth has to warm up some, to achieve the equilibrium between the sunlight coming in and the infrared radiation emitted out. If you calculate how opaque these gases are in the infrared, how much of the Earth’s body heat they intercept, you come out with the right answer. You find that on average—averaged over seasons, latitude, and time of day—the Earth’s surface must be some 13°C above zero. This is why the oceans don’t freeze, why the climate is congenial for our species and our civilization.
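  The same arithmetic can be extended with the simplest possible greenhouse: a single atmospheric layer, transparent to sunlight, that absorbs some fraction of the surface's infrared glow and re-radiates half of it back down. The sketch below does this; the infrared opacity used is an illustrative value chosen to land near the observed climate, not a measured constant:

```python
# "Leaky greenhouse": a one-layer atmosphere transparent to sunlight
# that absorbs a fraction EPSILON of the surface's infrared radiation
# and re-emits it equally up and down. Balancing energy at the surface
# and in the layer gives  T_surface = T_bare / (1 - EPSILON/2) ** 0.25.

SOLAR_CONSTANT = 1361.0   # W per square meter
ALBEDO = 0.31
SIGMA = 5.670e-8          # Stefan-Boltzmann constant
EPSILON = 0.78            # infrared opacity of the layer (illustrative)

t_bare = (SOLAR_CONSTANT * (1 - ALBEDO) / (4 * SIGMA)) ** 0.25
t_surface = t_bare / (1 - EPSILON / 2) ** 0.25
print(f"Airless Earth: {t_bare - 273.15:.0f} C; "
      f"with the blanket: {t_surface - 273.15:.0f} C")
# -> about -19 C without the blanket, about +14 C with it
```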

  Our lives depend on a delicate balance of invisible gases that are minor components of the Earth’s atmosphere. A little greenhouse effect is a good thing. But if you add more greenhouse gases—as we have been doing since the beginning of the Industrial Revolution—you absorb more infrared radiation. You make that blanket thicker. You warm the Earth further.

  For the public and policymakers, all this may seem a little abstract—invisible gases, infrared blankets, calculations by physicists. If difficult decisions on spending money are to be made, don’t we need a little more evidence that there really is a greenhouse effect and that too much of it can be dangerous? Nature has kindly provided, in the character of the nearest planet, a cautionary reminder. The planet Venus is a little closer to the Sun than the Earth, but its unbroken clouds are so bright that the planet actually absorbs less sunlight than the Earth. Greenhouse effect aside, its surface ought to be cooler than the Earth’s. It has very nearly the same size and mass as the Earth, and from all this we might naively conclude that it has a pleasant Earth-like environment, ultimately suitable for tourism. However, if you were to send a spacecraft through the clouds—made, by the way, largely of sulfuric acid—as the Soviet Union did in its pioneering Venera series of exploratory spacecraft, you would discover an extremely dense atmosphere made largely of carbon dioxide with a pressure at the surface 90 times what it is on Earth. If now you stick out a thermometer, as the Venera spacecraft did, you find that the temperature is some 470°C (about 900°F)—hot enough to melt tin or lead. The surface temperatures, hotter than those in the hottest household oven, are due to the greenhouse effect, largely caused by the massive carbon dioxide atmosphere. (There are also small quantities of water vapor and other infrared absorbing gases.) Venus is a practical demonstration that an increase in the abundance of greenhouse gases may have unpleasant consequences. It is a good place to send ideologically driven radio talk-show hosts who insist that the greenhouse effect is a “hoax.”

  As there get to be more and more humans on Earth, and as our technological powers grow still greater, we are pumping more and more infrared absorbing gases into the atmosphere. There are natural mechanisms that take these gases out of the air, but we are producing them at such a rate that we are overwhelming the removal mechanisms. Between the burning of fossil fuels and the destruction of forests (trees remove CO2 and convert it to wood), we humans are responsible for putting about 7 billion tons of carbon into the air every year, nearly all of it in the form of carbon dioxide.

  You can see in the figure on this page the increase with time of carbon dioxide in the Earth’s atmosphere. The data come from the Mauna Loa atmospheric observatory in Hawaii. Hawaii is not highly industrialized and is not a place where extensive forests are being burned down (putting more CO2 in the air). The increase in carbon dioxide with time detected over Hawaii comes from activities all over the Earth. The carbon dioxide is simply carried by the general circulation of the atmosphere worldwide—including over Hawaii. You can see that every year there’s a rise and fall of carbon dioxide. That’s due to deciduous trees, which, in summer, when in leaf, take CO2 out of the atmosphere, but in winter, when leafless, do not. But superimposed on that annual oscillation is a long-term increasing trend, which is absolutely unambiguous. The CO2 mixing ratio has now exceeded 350 parts per million—higher than it’s ever been during the tenure of humans on Earth. Chlorofluorocarbon increases have been the quickest—by about 5 percent a year—because of the worldwide growth of the CFC industry, but they are now beginning to taper off.* Other greenhouse gases, methane for instance, are also building up because of our agriculture and our industry.
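  To make the shape of that curve concrete, here is an illustrative little model in Python. The numbers are round approximations of the published record (about 315 parts per million when measurements began in 1958, an average rise of a bit over 1 ppm per year in the decades since, and a seasonal swing of several ppm); the function and its parameters are sketched, not fitted:

```python
import math

def co2_ppm(year):
    """Rough sketch of the Mauna Loa CO2 curve, in parts per million."""
    t = year - 1958.0                         # years since the record began
    trend = 315.0 + 1.2 * t                   # long-term rise (approximate)
    season = 3.0 * math.sin(2 * math.pi * t)  # annual breathing of the forests
    return trend + season                     # phase and amplitude are rough

for year in (1958, 1975, 1990):
    print(f"{year}: about {co2_ppm(year):.0f} ppm")
```

  The annual wiggle rides on top of the trend; it is the trend that matters here.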

  Well, if we know by how much greenhouse gases are building up in the atmosphere and we claim to understand what the resulting infrared opacity is, shouldn’t we be able to calculate the increase of temperature in recent decades as a consequence of the buildup of CO2 and other gases? Yes, we can. But we have to be careful. We must remember that the Sun goes through an 11-year cycle, and that how much energy it puts out changes a little over its cycle. We must remember that volcanoes occasionally blow their tops and inject fine sulfuric acid droplets into the stratosphere, thereby reflecting more sunlight back into space and cooling the Earth a little. A major explosion can, it is calculated, lower the world temperature by nearly a Celsius degree for a few years. We must remember that in the lower atmosphere there is a pall of tiny sulfur-containing particles from industrial smokestack pollution that—however damaging to people on the ground—also cools the Earth, as does windblown mineral dust from disturbed soils. If you make allowances for these factors and many more, if you do the best job climatologists are now capable of doing, you reach this conclusion: Over the twentieth century, due to the burning of fossil fuels, the average temperature of the Earth should have increased by a few tenths of a degree Celsius.
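  The CO2 part of that bookkeeping can be sketched with a standard simplified formula for its radiative forcing, delta-F = 5.35 ln(C/C0) watts per square meter, multiplied by a climate-sensitivity parameter. The sensitivity range below is an assumed, illustrative spread, and the sketch ignores ocean lag, volcanoes, and aerosol cooling, so it brackets rather than reproduces the careful model results:

```python
import math

# Simplified CO2 radiative forcing: delta_F = 5.35 * ln(C / C0) W/m^2,
# a standard curve fit to detailed infrared-absorption calculations.
C_PREINDUSTRIAL = 280.0   # ppm, before the Industrial Revolution
C_RECENT = 355.0          # ppm, roughly the early-1990s value

forcing = 5.35 * math.log(C_RECENT / C_PREINDUSTRIAL)
print(f"Extra infrared trapping so far: {forcing:.1f} W per square meter")

for sensitivity in (0.3, 0.8):   # degrees C per (W/m^2), assumed range
    print(f"  sensitivity {sensitivity}: about "
          f"{sensitivity * forcing:.1f} C of warming at equilibrium")
# -> roughly 0.4 to 1.0 C, of which only part has yet appeared:
#    a few tenths of a degree, as the text says
```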

  Naturally you would like to compare this prediction with the facts. Has the Earth’s temperature increased at all, especially by this amount, during the twentieth century? Here again you must be careful. You must use temperature measurements made far from cities, because cities, through their industry and their relative lack of vegetation, are actually hotter than the surrounding countryside. You must properly average out measurements made at different latitudes, altitudes, seasons, and times of day. You must allow for the difference between measurements on land and measurements on water. But when you do all this, the results seem consistent with the theoretical expectation.

  The Earth’s temperature has increased a little, less than a degree Celsius, in the twentieth century. There are substantial wiggles in the curves, noise in the global climatic signal. The ten hottest years since 1860 have all occurred in the ’80s and early ’90s—despite the cooling of the Earth from the 1991 explosion of the Philippine volcano Mount Pinatubo. Mount Pinatubo introduced 20 to 30 megatons of sulfur dioxide and aerosols into the Earth’s atmosphere. Those materials completely circled the Earth in about three weeks. After only two months they had covered about two-fifths of the Earth’s surface. This was the most violent volcanic eruption in this century after that of Mount Katmai in Alaska in 1912. If the calculations are right and there are no more big volcanic explosions in the near future, by the end of the ’90s the upward trend should reassert itself. It has: 1995 was marginally the hottest year on record.

  Another way to check whether the climatologists know what they’re doing is to ask them to make predictions retrospectively. The Earth has gone through ice ages. There are ways of measuring how the temperature fluctuated in the past. Can they predict (or, better, postdict) the climates of the past?

  Important findings on the climate history of the Earth have emerged by studying cores of ice cut and extracted from the Greenland and Antarctic ice caps. The technology for these borings comes straight from the petroleum industry; in this way, those responsible for extracting fossil fuels from the Earth have made an important contribution to clarifying the dangers of so doing. Minute physical and chemical examination of these cores reveals that the temperature of the Earth and the abundance of CO2 in its atmosphere go up and down together—the more CO2, the warmer the Earth. The same computer models used to understand the global temperature trends of the last few decades correctly postdict ice age climate from fluctuations in greenhouse gases in earlier times. (Of course no one is saying that there were pre–ice age civilizations that drove fuel-inefficient cars and poured enormous quantities of greenhouse gases into the atmosphere. Some variation in the amount of CO2 happens naturally.)

  In the last few hundred thousand years, the Earth has gone into and emerged out of several ice ages. Twenty thousand years ago, the city of Chicago was under a mile of ice. Today we are between ice ages, in what’s called an interglacial interval. The typical temperature difference for the whole world between an ice age and an interglacial interval is only 3° to 6°C (equivalent to a temperature difference of 5° to 11°F). This should immediately set alarm bells ringing: A temperature change of only a few degrees can be serious business.

  With this experience under their belts, this calibration of their abilities, climatologists can now try to predict just what the future climate of the Earth may be like if we keep on burning fossil fuels, if we continue to pour greenhouse gases into the atmosphere at a frenetic pace. Various scientific groups—modern equivalents of the Delphic Oracle—have employed computer models to predict how much the world temperature increases if, say, the amount of carbon dioxide in the atmosphere doubles, which it will (at the present rate of burning fossil fuels) by the end of the twenty-first century. The chief oracles are the Geophysical Fluid Dynamics Laboratory of the National Oceanic and Atmospheric Administration (NOAA) at Princeton; the Goddard Institute for Space Studies of NASA in New York; the National Center for Atmospheric Research in Boulder, Colorado; the Department of Energy’s Lawrence Livermore National Laboratory in California; Oregon State University; the Hadley Centre for Climate Prediction and Research in the United Kingdom; and the Max Planck Institute for Meteorology in Hamburg. They all predict that the average temperature increase will be between about 1° and 4°C. (In Fahrenheit it’s about twice that.)
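  Applying the same simplified forcing formula from before to the doubling the oracles consider gives numbers in the same neighborhood; the sensitivity range is, again, an assumed illustrative spread:

```python
import math

forcing_2x = 5.35 * math.log(2.0)   # about 3.7 W/m^2 for doubled CO2
for sensitivity in (0.3, 0.8):      # degrees C per (W/m^2), assumed range
    dt_c = sensitivity * forcing_2x
    print(f"sensitivity {sensitivity}: {dt_c:.1f} C = {dt_c * 9 / 5:.1f} F")
# -> roughly 1 to 3 C (2 to 5 F), consistent with the models' 1 to 4 C
```

  However you slice it, the models converge on a warming of a few degrees within about a century.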

  This is faster than any climate change observed since the rise of civilization. At the low end, developed, industrial societies, at least, might be able with a little struggle to adjust to the changed circumstances. At the high end, the climatic map of the Earth would be dramatically changed, and the consequences, both for rich and poor nations, might be catastrophic. Over much of the planet, we have confined forests and wildlife to isolated, noncontiguous areas. They will be unable to move as the climate changes. Species extinctions will be greatly accelerated. Major transplanting of crops and people will become necessary.

  None of the groups claims that doubling the carbon dioxide content of the atmosphere will cool the Earth. None claims that it will heat the Earth by tens or hundreds of degrees. We have an opportunity denied to many ancient Greeks—we can go to a number of oracles and compare prophecies. When we do so, they all say more or less the same thing. The answers in fact are in good accord with the most ancient oracles on the subject—including the Swedish Nobel Prize-winning chemist Svante Arrhenius, who around the turn of the century made a similar prediction using, of course, much less sophisticated knowledge of the infrared absorption of carbon dioxide and the properties of the Earth’s atmosphere. The physics used by all these groups correctly predicts the present temperature of the Earth, as well as the greenhouse effects on other planets such as Venus. Of course, there may be some simple error that everyone has missed. But surely these concordant prophecies deserve to be taken very seriously.

  There are other disquieting signs. Norwegian researchers report a decrease in the extent of Arctic ice cover since 1978. Enormous rifts in the Wordie Ice Shelf in Antarctica have been evident over the same period. In January 1995, a 4,200-square-kilometer piece of the Larsen Ice Shelf broke away into the Antarctic Ocean. There has been a notable retreat of mountain glaciers everywhere on Earth. Extremes of weather are increasing in many parts of the world. Sea level is continuing to rise. None of these trends by itself is compelling proof that the activities of our civilization, rather than natural variability, are responsible. But together, they are very worrisome.

  Increasing numbers of climate experts have recently concluded that the “signature” of man-made global warming has been detected. Representatives of the 2,500 scientists of the Intergovernmental Panel on Climate Change, after an exhaustive study, concluded in 1995 that “the balance of evidence suggests there is a discernible human influence on climate.” While not yet “beyond all reasonable doubt,” says Michael MacCracken, director of the U.S. Global Change Research Program, the evidence “is becoming quite compelling.” The observed warming “is unlikely to be caused by natural variability,” says Thomas Karl of the U.S. National Climatic Data Center. “There’s a 90 to 95 percent chance that we’re not being fooled.”

  The following sketch offers a very broad perspective. At the left, it’s 150,000 years ago; we have stone axes and are really pleased with ourselves for having domesticated fire. The global temperatures vary with time between deep ice ages and interglacial periods. The total amplitude of the fluctuations, from the coldest to the warmest, is about 5°C (almost 10°F). So, the curve wiggles along, and after the end of the last ice age, we have bows and arrows, domesticated animals, the origin of agriculture, sedentary life, metallic weapons, cities, police forces, taxes, exponential population growth, the Industrial Revolution, and nuclear weapons (all that last part is invented just at the extreme right of the solid curve). Then we come to the present, the end of the solid line. The dashed lines show some projections of what we’re in for because of greenhouse warming. This figure makes it quite clear that the temperatures we have now (or are shortly to have if present trends continue) are not just the warmest in the last century, but the warmest in the last 150,000 years. That’s another measure of the magnitude of the global changes we humans are generating, and their unprecedented nature.

  Global warming does not by itself make bad weather. But it does heighten the chances of having bad weather. Bad weather certainly does not require global warming, but all computer models show that global warming should be accompanied by significant increases in bad weather—severe drought inland, severe storm systems and flooding near the coasts, both much hotter and much colder weather locally, all driven by a relatively modest increment in the average planetary temperature. This is why extreme cold weather in, say, Detroit in January is not the telling refutation of global warming that some newspaper editorial pages pretend. Bad weather can be very expensive. To take a single example, the American insurance industry alone suffered a net loss of some $50 billion in the wake of a single hurricane (Andrew) in 1992, and that’s only a small fraction of the total 1992 losses. Natural disasters cost the United States over $100 billion a year. The world total is much larger.