At the time, the most prominent chemist in Germany was Justus von Liebig (1803–1873). Ambitious, charismatic, and contentious, Liebig was an educational innovator, a scientific visionary, and a conniving fraud—he faked his doctoral dissertation and, apparently, entire lines of experiments. Nonetheless, his reputation was such that in 1837 the British Association for the Advancement of Science commissioned Liebig, rather than any British scientist, to write a report on the state of organic chemistry. Three years later the great man returned with a report on a somewhat different topic: agricultural chemistry, a subject he had never previously researched. His conclusions were strikingly similar to Sprengel’s, though he mentioned the other man’s earlier research only in passing, dismissively. Sprengel complained, but Liebig’s celebrity ensured that he received the credit for Sprengel’s ideas. This turn of events was particularly unjust for what has become known, incorrectly, as Liebig’s Law of the Minimum: plants need many nutrients, but their growth rate is limited by the one least present in the soil.

  In most cases, that nutrient is nitrogen. At first blush, the notion of nitrogen being a limit seems odd; there is more nitrogen in the world than carbon, oxygen, phosphorus, and sulfur combined. Unfortunately, more than 99 percent of that nitrogen is nitrogen gas. Nitrogen gas—N2 in chemical notation—consists of two nitrogen atoms bound together so tightly that plants cannot split them apart for use. Instead, plants are able to absorb nitrogen only when it is in chemical combinations—“fixed,” as scientists say—that are easier to break up.

  In the soil, nitrogen is mainly fixed by microorganisms. Some break down organic matter, making its nitrogen available again; others, such as the symbiotic bacteria that live around the roots of beans, clover, lentils, and other legumes, directly fix nitrogen gas into compounds plants can take in. (A small amount is fixed by lightning, which zaps apart nitrogen molecules in the air, after which they combine with oxygen into compounds that dissolve in rainwater.) When farmers put additives like ashes, blood, urine, compost, and animal feces in their fields, they are providing fodder for nitrogen-fixing soil microorganisms; when they grow legumes, they increase the supply of nitrogen-fixing bacteria. The implication of Liebig’s work was that dumping artificially created nitrogen compounds—chemical fertilizers—into fields would do the same thing.

  Justus von Liebig in a portrait from 1846 Credit 33

  Smarmy but far-sighted, Liebig envisioned a new kind of farming: agriculture as a branch of chemistry and physics. In this scheme, soil was just a base with the physical attributes necessary to hold roots. What mattered to agriculture were the chemical nutrients on which plant growth depended: nitrogen, potassium, phosphorus, calcium, and so on. If farmers wished, they could plant seeds in soil with zero humus (expanses of sand, perhaps), sprinkle the seeds with the doses of water and chemicals prescribed by experts, and the seeds would germinate and grow. A farm would be an organic machine, in the phrase of the historian Richard White; the new agriculture, precise as a clock, would be a harnessed flow of energy and matter. Vis vitalis, living humus, and pneuma—not only were they irrelevant, they did not exist. Crops and soil were brute physical matter, collections of molecules to be optimized by chemical recipes, rather than flowing, energy-charged wholes. In today’s terms, Liebig was taking the first steps toward industrial agriculture regulated by farm chemicals—an early version of Wizardly thought.*1

  At the time, the biggest known fertilizer source was Peruvian guano. As demand drove up prices and reduced supplies, attention turned to sodium nitrate. Sodium nitrate (NaNO3) consists of a sodium atom, a nitrogen atom, and three oxygen atoms, all bound together loosely enough for plants to assimilate the nitrogen. The world’s biggest nitrate deposits are in the high desert of northern Chile. Although it almost never rains there, the area is constantly bathed in a fine spray from the Pacific Ocean. The spray is very thin—less than an inch per year—but it contains the nutrients from the Humboldt Current that feed the anchovetas that feed the cormorants. Other nutrients fall from the sky as dust or well up from groundwater. With next to no rainfall to wash away the residue, the deposits build up over time. The result: a layer of naturally deposited fertilizer four hundred miles long, twelve miles wide, and up to nine feet deep. It was eagerly exploited. Nitrates from Chile became a principal ingredient in packaged fertilizers—and, alas, bombs. By the beginning of the twentieth century, as Vaclav Smil of the University of Manitoba has written in his history of nitrogen use, almost half the nitrates shipped to the United States were used to make explosives.

  In 1898 the British chemist William Crookes rang an alarm: the nitrogen would run out. Crookes was the new president of the British Association for the Advancement of Science, the group that had commissioned Liebig’s report. In his inaugural address, Crookes focused, quite literally, on Europe’s daily bread. The “bread-eaters of the world,” as he called them, were increasing by more than 6 million a year. To feed these new bread eaters, farmers either would have to expand into unused land or produce more from their existing land by fertilizing it more heavily. Neither course was possible, Crookes thought. Most suitable land was already under the plow. And increasing the demand for fertilizer would exhaust the supply of Peruvian guano and Chilean nitrates in “a few years.” By the 1930s, Crookes predicted, the world’s wheat supply “will fall so far short of the demand as to constitute general scarcity.” Science, Crookes hoped, would somehow save the bread eaters.

  He got his wish. Science did save the day—at least for a while.

  The Story of N (Synthetic Version)

  Six years after Crookes issued his warning, an Austrian chemical company asked the German chemist Fritz Haber to look into synthetic fertilizer. More precisely, the Austrians asked Haber to look into synthetic ammonia. For decades researchers had believed that if they could manufacture ammonia it could be used as the basis for a synthetic fertilizer—something made in a factory instead of being dug out of the ground and shipped across the ocean. Chemically speaking, ammonia (NH3) is simple: three hydrogen atoms, one nitrogen atom, arranged in a rough pyramid. Both hydrogen and nitrogen normally exist as gases, H2 and N2. In theory, one should be able to split gaseous nitrogen and hydrogen into single N and H atoms, then put together the separate atoms into NH3 like so many building blocks. Scientists had figured out how to split the hydrogen. But, like plants, they had been unable to pry apart N2. Every attempt to make synthetic ammonia had failed.

  “Failed” in this case meant “failed to come up with something that industry could use.” Chemists actually had synthesized ammonia, but only at ultra-high temperatures and pressures in costly laboratory experiments. And even in these extreme circumstances the reaction needed a catalyst, a substance that facilitates a chemical reaction without being consumed by it. Catalysts are like jaywalking pedestrians who cause car accidents but walk away from them unscathed. Unlike the disruptive pedestrians, though, catalysts are essential to the smooth functioning of thousands of chemical processes.

  Several metals served as catalysts for ammonia synthesis. In the right conditions, the metal adsorbs hydrogen and nitrogen gases, dissociating them into separate hydrogen and nitrogen atoms. Now unattached, the nitrogen atoms can easily bond with hydrogen atoms, creating ammonia molecules. Some of the energy from the newly formed N-H bonds helps the ammonia molecule leave (“desorb,” in the jargon) the metal surface and float into the air. The metal is left unchanged.
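  The overall transformation that these catalytic steps accomplish can be summarized in a single balanced equation; this is the standard textbook form of the reaction, with an approximate textbook value for the heat released, neither of which appears in the text above:

```latex
% Overall Haber-Bosch reaction: one nitrogen molecule and three hydrogen
% molecules yield two molecules of ammonia. The reaction is reversible
% and exothermic (it releases heat).
\[
\mathrm{N_2} + 3\,\mathrm{H_2} \;\rightleftharpoons\; 2\,\mathrm{NH_3},
\qquad \Delta H \approx -92\ \text{kJ per mole of } \mathrm{N_2}
\]
```

  Because four gas molecules combine into two, high pressure pushes the equilibrium toward ammonia, which helps explain why the industrial process described later in this chapter demanded such formidable high-pressure equipment.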

  Haber tried this himself. Like his predecessors, he found that blowing hot, high-pressure nitrogen and hydrogen gas over metals like iron, manganese, and nickel produced tiny but measurable amounts of ammonia—it converted about one-hundredth of 1 percent of the original gases. By repeatedly recirculating the hydrogen and nitrogen, Haber could very slowly fix most of the nitrogen into ammonia. But, as he told the Austrians, the process was too arduous to justify the cost. It would be like spending millions of dollars to build an orange-juice factory that produced a teaspoon of juice a day.
  Fritz Haber (right) supervises a laboratory assistant in a newspaper photograph from 1918, the year he won his Nobel Prize. Credit 34

  Soon after, luck entered the picture in the form of Walther Nernst, a brilliant but caustic physicist. Nernst’s experiments on the effects of heat in chemical reactions led him to conclude in 1907 that Haber’s estimates of ammonia production were much too high. Even though Haber thought his figures were pessimistic, they overestimated ammonia production by almost a factor of four. Haber repeated his earlier tests and this time obtained results that were close to Nernst’s claims. Chagrined, Haber acknowledged his error. In a fit of pettiness, Nernst publicly derided Haber’s “strongly inaccurate” work, which Nernst said had led him to believe “that it might be possible to synthesize ammonia from nitrogen and hydrogen.” As Smil, the nitrogen historian, has pointed out, this was nonsense: Haber had concluded that fixing nitrogen was not feasible, the opposite of Nernst’s suggestion. Still, the incident was humiliating.

  Determined to redeem himself, Haber returned to ammonia. This time he had acquired a new ally: Badische Anilin- und Soda-Fabrik (BASF), the world’s biggest chemical firm. The basic problem was that ammonia was best synthesized when the hydrogen and nitrogen were at high temperature and high pressure, but those very conditions pushed up the procedure’s cost and difficulty. BASF helped Haber build a better high-pressure apparatus and search for better catalysts. A breakthrough came on July 2, 1909, when, for five hours straight, Haber pumped hot gas into the apparatus and “produced continuously liquid ammonia.”

  Haber’s experimental model was just two and a half feet tall, too small for commercial production. And his catalysts, osmium and uranium, were commercially unsuitable: the total global supply of osmium was less than 250 pounds and uranium was dangerous—not just because it is radioactive, but because it reacts explosively with oxygen and water. Nonetheless, he had demonstrated that it was possible to synthesize ammonia at high volume.

  BASF put a chemical engineer named Carl Bosch in charge of scaling up Haber’s process and finding more affordable catalysts. Bosch, too, had spent years trying to fix nitrogen. When he learned that Haber had beaten him to the punch, he told the company without regret that it should immediately develop his rival’s design. Building the requisite high-pressure tanks proved to be especially difficult, because—an unhappy surprise to Bosch—the hydrogen diffusing into the walls combined with the carbon in the steel, weakening the metal. Meanwhile, Bosch set up a team that tested thousands of compounds to find a better catalyst. The best proved to be iron with a little aluminum, calcium, and magnesium. By 1913 BASF’s first big ammonia plant was running.

  Five years later Haber received a Nobel Prize for synthesizing ammonia; Bosch and his main assistant received a Nobel in 1931, for developing “chemical high pressure methods.” Ammonia synthesis remained so costly that artificial fertilizers did not truly become common until the 1930s. Nonetheless, the Nobels were richly deserved; the Haber-Bosch process, as it is called, was arguably the most consequential technological development of the twentieth century, and one of the more important human discoveries of any time. The Haber-Bosch process has literally changed the land and sky, reshaped the oceans, and powerfully affected the fortunes of humanity. The German physicist Max von Laue put it neatly: Haber and Bosch made it possible to “win bread from air.”

  Carl Bosch (left) receives the 1931 Nobel Prize in Chemistry from Crown Prince Gustav of Sweden. Credit 35

  Today the Haber-Bosch process is responsible for almost all of the world’s synthetic fertilizer. A little more than 1 percent of the world’s industrial energy is devoted to it, as the futurist Ramez Naam has noted. Remarkable fact: “That 1 percent,” Naam says, “roughly doubles the amount of food the world can grow.” Between 1960 and 2000 global synthetic fertilizer use rose by about 800 percent. About half of that production was devoted to just three crops: wheat, rice, and maize. One way to look at this figure is to say that the accomplishment of Borlaug and his associates was to create strains of wheat, rice, and maize that could use what Haber and Bosch had provided.

  Increasing the food supply has led to a concomitant increase in human numbers. Vaclav Smil has calculated that fertilizer from the Haber-Bosch process was responsible for “the prevailing diets of nearly 45% of the world’s population.” Roughly speaking, this is equivalent to feeding about 3.25 billion people. More than 3 billion men, women, and children—an incomprehensibly vast cloud of dreams, fears, and explorations—owe their existence to two early-twentieth-century German chemists.

  The magnitude of the change wrought by artificially fixed nitrogen is hard to grasp. Think of the deaths from hunger that have been averted, the opportunities granted to people who would otherwise not have had a chance to thrive, the great works of art and science created by those who would have had to devote their lives to wringing sustenance from the earth. Particle accelerators in Japan, Switzerland, and Illinois; One Hundred Years of Solitude and Things Fall Apart; vaccines, computers, and antibiotics; the Sydney Opera House and Steven Holl’s Chapel of St. Ignatius—how many are owed, indirectly, to Haber and Bosch? How many would exist if this Wizardly triumph had not produced the nitrogen that filled their creators’ childhood plates?

  Hard on the heels of the gains were the losses. About 40 percent of the fertilizer applied in the last sixty years wasn’t assimilated by plants; instead, it washed away into rivers or seeped into the air in the form of nitrous oxide. Fertilizer flushed into rivers, lakes, and oceans is still fertilizer: it boosts the growth of algae, weeds, and other aquatic organisms. When these die, they rain to the ocean floor, where they are consumed by microbes. So rapidly do the microbes grow on the increased food supply that their respiration drains the oxygen from the lower depths, killing off most life. Where agricultural runoff flows, dead zones flourish. Nitrogen from Middle Western farms flows down the Mississippi to the Gulf of Mexico every summer, creating an oxygen desert that in 2016 covered almost 7,000 square miles. The next year a still larger dead zone—23,000 square miles—was mapped in the Bay of Bengal.

  Fueling the fire are automobile engines, which as a by-product of combustion convert nitrogen gas into various nitrogen oxides (NOx, in the language of chemists). Rising into the stratosphere, nitrogen oxides react with the planet’s protective ozone, which guards life on the surface by blocking harmful ultraviolet rays; the reactions pull ozone off duty. Down below, NOx drives smog and other air pollution. The total cost of unwanted nitrogen has been estimated at hundreds of billions of dollars a year. Were it not for climate change, suggests the science writer Oliver Morton, the spread of nitrogen’s empire would be our biggest ecological worry.*2

  Law of Return

  Action brings reaction, every yes from one followed by another’s no. Justus von Liebig’s proto-Wizardly vision of industrial agriculture, the farm as organic machine, generated a proto-Prophetic attempt to bring back the living spirit he had banished. The counterforce saw each achievement touted by the modernizers as a deficit, every new landmark as a ruin. Had they been able to read Warren Weaver’s manifesto, they would have rejected it from the first page: agriculture was about more than “usable energy.” In their view, the modernizers had forgotten about the living humus—pneuma, so to speak. On many levels this was a dreadful mistake, the counterforce believed, one that would reverberate through time and space.

  “It is notoriously difficult to identify precisely the beginning of a cultural movement,” wrote the University of Leicester historian Philip Conford. In his history of organic farming, Conford argued that the most appropriate date for the beginning of the pushback against Liebig-style agriculture was the 1920s, when Haber-Bosch fertilizer was beginning to spread across the world. Resistance awoke in Africa, Germany, Great Britain, and the United States. But the most important source was South Asia, where challengers found inspiration in the small, traditional farms that Norman Borlaug would later try to modernize.

  Among the first naysayers was Robert McCarrison, later Major General Sir Robert McCarrison, C.I.E., F.R.C.P. Raised in Northern Ireland, McCarrison joined the military as a surgeon right after obtaining his medical degree. In 1901 he came to what is now Pakistan and was then the northern tip of British India. Twenty-three years old, he had no training in epidemiology, public health, or environmental science. Nonetheless he made contributions to all of these, discovering the environmental causes of diseases (bacteria-carrying insects, vitamin deficiencies) and methods to prevent them. Eventually he became the colonial director of nutritional research, a post he retained until his retirement in 1935.

  In his work as a physician, McCarrison traveled through the high reaches of northern Pakistan, where he encountered the Hunza Valley, inhabited by an Ismaili Muslim people “whose sole food consists to this day of grains, vegetables, and fruits, with a certain amount of milk and butter, and goat’s meat only on feast days.” The Hunza were superb physical specimens: “unsurpassed in perfection of physique and in freedom from disease in general.” In seven years of visits McCarrison “never saw a case of asthenic dyspepsia [chronic stomach distress], or gastric or duodenal ulcer, or appendicitis, or mucous colitis [irritable bowel syndrome], or cancer.” Education and affluence were not the cause of their “extraordinarily long” lifespans; the Hunza were illiterate and so impoverished that most could not afford to keep dogs. McCarrison came to believe that their good health was due to their diet.