  The notoriously corrupt Tammany Hall government of New York City resisted reform, but finally, in 1862, passed a city ordinance outlawing swill milk, to little effect. Difficult to enforce even in the city, the new law did nothing to help manage poor dairy practices beyond its boundaries. More than two decades later a study published in the Journal of the American Chemical Society looked at swill milk still being produced just across the Hudson River in New Jersey and found “so numerous a proportion of liquefying colonies [of bacteria] that further counting was discontinued.” A subsequent report in Indiana by that state’s board of health added that a random sampling of milk found “sticks, hairs, insects, blood, pus and filth.”

  Under Wiley, the Agriculture Department’s first detailed examination of food products, Foods and Food Adulterants (Technical Bulletin No. 13), was published in three parts in 1887. It revealed, as expected, that little had improved with regard to how milk was produced and what it contained. Wiley’s investigating chemists had found a routinely thinned product, dirty and whitened with chalk. It wasn’t just bacteria swimming in the milk. At least one of the samples that Wiley’s crew tested had worms wriggling in the bottom of the bottle. The Division of Chemistry’s findings about other dairy products were more eye-opening. Much of the “butter” that the scientists found on the market had nothing to do with dairy products at all except for the fictitious name on the product.

  The ability of producers to so mislead resulted from the work of several French chemists, including one of the nineteenth century’s greatest, Michel Eugène Chevreul. He drew from the Greek word margarites, meaning pearl, and added the Latin for oil, oleum, to coin the term oléomargarine, which is what he called a glossy, whitish semisolid that two colleagues had derived from olive oil. In 1869 inventor Hippolyte Mège-Mouriès appropriated Chevreul’s terminology and applied it to a butter substitute he made from beef tallow and finely ground animal stomachs. Mège-Mouriès’s product was the basis of a host of butter substitutes embraced by American food processors, which began manufacturing an inventive range of such products in 1876.

  Eager to expand a new market, U.S. innovators competed to improve oleomargarine, seeking patents for variations such as “suine” (from suet) and “lardine” (made from pork fat). The industry especially took off after the powerful meatpacking interests realized the potential for profit from the by-products of slaughterhouses and canneries. Barely had the idea of oleomargarine reached the fast-growing Chicago stockyards when some processors decided that if they added just a dab of actual milk to the product, they might cast off its meaty association. Trying for a more appealing name, meatpackers like the Armour brothers and Gustavus Swift borrowed another term for margarine that was in use in Britain, one that at least sounded dairy based: “butterine.” Other manufacturers didn’t even bother with that terminology; they simply called their oleomargarine “butter.”

  In his 1883 book Life on the Mississippi, Mark Twain recounted overheard comments made by an oleomargarine salesman from Ohio. “You can’t tell it from butter,” the salesman said. “By George, an EXPERT can’t. . . . You are going to see the day, pretty soon, when you can’t find an ounce of butter to bless yourself with, in any hotel in the Mississippi and Ohio Valleys, outside of the biggest cities. Why, we are turning out oleomargarine NOW by the thousands of tons. And we can sell it so dirt-cheap that the whole country has GOT to take it—can’t get around it, you see. Butter don’t stand any show. . . . Butter’s had its DAY.”

  The dairy industry, not surprisingly, disagreed. And furiously. Dairy organizations petitioned members of Congress, demanding action and protection from such deceptive practices. The resulting hearings in both the U.S. Senate and the House of Representatives in 1885 reflected that bitterness, taking up the issue of whether margarine should even be allowed for sale in the United States.

  “We face a new situation in history. Ingenuity, striking hands with cunning trickery, compounds a substance to counterfeit an article of food,” charged U.S. congressman Robert La Follette. A Wisconsin Republican, La Follette was firmly in the corner of that state’s numerous dairymen. They objected especially to the practice of coloring oleomargarine to make it look like butter. La Follette conveniently overlooked the fact that butter itself, when produced in the winter from cows fed on hay rather than pasture grass, turns out more white than yellow—and that in addition to diluting and adulterating milk, some dairies routinely added golden coloring to their pale butter. The new, nondairy spreads were nothing better than “counterfeit butter,” the congressman charged. Congressman William Grout, Republican of Vermont, went further, dubbing the products “bastard butter.” Without regulation, who knew what might be in the stuff? Grout called it “the mystery of mysteries.”

  Patent applications for margarine listed such ingredients as nitric acid, sulfate of lime, and even sugar of lead. Congressman Charles O’Ferrall, a Virginia Democrat, decried the inclusion of bromo-chloralum, a disinfectant also used to treat smallpox. O’Ferrall charged that the disinfectant’s purpose in margarine was “to destroy the smell and prevent detection of the putrid mass” of ground-up sheep, cow, and pig stomachs used in many recipes. Lawmakers wanted to know if other leftover bits of dead animals were finding their way into the recipes. “You do not think that you could make good oleomargarine out of a dead cat or dog?” asked Senator James K. Jones, a Democrat from Arkansas, questioning an industry representative. “It has reached the point in the history of the country where the city scavenger butters your bread,” declared Congressman David B. Henderson, an Iowa Republican. Witness L. W. Morton protested. “An ounce of stale fat put into a ton of good fresh fat will spoil the whole,” Morton testified, pointing out that it was common knowledge that butter also went bad.

  The hearings led to the Butter Act of 1886, which passed with support from both parties and was signed by President Cleveland. But thanks to intervention from the meatpackers, the law was less than hard-hitting, imposing a tax of merely two cents a pound on margarine, leaving the imitation still cheaper to produce than the real thing. The law did define butter as “made exclusively from milk or cream” (with the possible addition of salt or dye), meaning that products like butterine had to be labeled “oleomargarine.” False labelers could be fined up to $1,000—assuming they could be caught.

  Members of Wiley’s staff had been witnesses at the hearings, but their findings in the new Bulletin 13 series weren’t issued until the next year, 1887, which made the report an anticlimax of sorts. The studies by the agriculture chemists clearly established, however, that at least a third of what was sold commercially as farm-fresh butter was oleomargarine. The bulletin also noted that thirty-seven American factories were producing more than three million pounds of oleomargarine from animal fats every month. The quality varied widely and there was at least a possibility that some animal parasites could survive the manufacturing process and be present in the spread that consumers purchased. “It is undoubtedly true that a great deal of artificial butter has been thrown on the market that is carelessly made,” Wiley wrote.

  Still, the Agriculture Department did not offer a blanket condemnation. The division chemists found that if animal-fat oleomargarine was made with care, the product was in many ways comparable to butter, with “nearly the same chemical composition in digestibility. There may be a slight balance in favor of butter but for healthy persons this difference can hardly be of any considerable consequence.”

  The primary health concerns, the investigation found, derived from dyes used to improve the look of butter and margarine. Traditional butter dyes had been vegetable products: annatto (from the fruit of a South American tree), turmeric, saffron, marigold, and even carrot juice—all benign if pure. But suppliers were adulterating the dyes. Annatto, the most popular, often had brick dust, chalk, and traces of red ocher mixed into it. Processors were also using industrial dyes such as chromate of lead, already notorious for instances of lead poisoning from eating yellow candy. Similar problems occurred in cheese, where manufacturers used red lead to enrich color. In all food products, the report warned, “the use of mineral coloring like chromate of lead is highly reprehensible.”

  The Division of Chemistry included in the report descriptions of several methods for testing products. With the use of a microscope and a little knowledge of what to look for, it was easy to tell if a spread was butter or margarine. At the molecular level, butter displayed long, delicate, needlelike crystalline structures. Melted, it appeared as shorter needles gathered in bundles. Beef fat crystals, by contrast, appeared as spiky, needle-studded globes, like a “sea urchin or hedgehog.” Oleomargarine was a messy tumble of crystalline clumps resembling flattened cauliflowers. Complete with photos, these were handy instructions for anyone with access to a microscope but of little use to the average consumer in 1887.

  That same year a New York chemist, Jesse Park Battershall, published a book called Food Adulteration and Its Detection, which offered easier home tests. Some, such as one to detect adulterations in tea, could be conducted in any home kitchen. Battershall recommended simply putting the “tea” into a cylinder containing cold water, capping it, and shaking it hard. Ingredients other than tea would form either a scum on the top or a sludge on the bottom. “In this way, Prussian blue (cyanide, used as dye), indigo (another dye), soapstone, gypsum, sand, and turmeric can be separated,” Battershall explained. And, he added, housewives should not be too surprised to find them there.

  Against the backdrop of rising public concern, and with Commissioner Colman’s support, Wiley resolved to continue raising awareness about impurities and fakery in American food products. The 1887 issues of Bulletin 13 examined three broad areas of food and beverage manufacture, dairy being only the first. The second subject had gotten far less attention—certainly nothing like congressional hearings, let alone a regulatory law—but it concerned products even more rife with fakery. “Could only a portion of the unfortunate dislike for oleomargarine be directed toward the spices?” Wiley wrote in an official letter to his boss.

  Two

  CHEATED, FOOLED, AND BAMBOOZLED

  1887–1896

  And daintily finger the cream-tinted bread,

  Just needing to make it complete

  At the U.S. Customs Service laboratory in New York City, where he was a supervisor, colleagues described chemist Jesse Park Battershall as a rather shy, meticulously cautious scientist. Yet Battershall’s 1887 book on food adulteration seethed with outrage over virtually every product that American grocers sold. His list included milk and butter, of course, as well as cheese, coffee, chocolate and cocoa, bread, and “baker’s chemicals” (baking powders and sodas), and an appalling amount of candy laced with poisonous metallic dyes. He had tested 198 samples of candy and found that a full 115 were tainted by the use of dangerous dyes, mostly arsenic and lead chromate. Forty-one out of forty-eight samples of yellow and orange-colored candy, in fact, contained lead. He had warned of cyanide, indigo, soapstone, gypsum, sand, and turmeric in teas, but he’d also found that the leaves themselves represented a variety of cheats. In standard black and green teas, Battershall found mixtures of backyard leaves from rosebushes, wisteria vines, and trees, including beech, hawthorn, willow, elm, and poplar.

  But even this paled, according to the Chemistry Division’s report on spices and condiments, beside the fakery involved in those products. This was not entirely a surprise. Ground, flaked, or powdered food products had long been known as easy to cut with something else or to replace entirely with some other, cheaper powder. Ancient Roman documents tell of first-century BCE merchants selling mustard seed and ground juniper berries as pepper. In thirteenth-century England, there were tradespeople called garblers (from an old Arabic word for sieve), hired to inspect imported spices and sift out grain and grit. Predictably, some garblers, those in the employ of unscrupulous importers or merchants, did just the opposite, mixing ground twigs and sand into the spices themselves. Eventually the very word “garble” came to mean mixing things up incorrectly.

  By the late nineteenth century, some countries—notably Great Britain—had laws regulating spices. Clifford Richardson, the scientist whom Wiley assigned to take the lead on the USDA spice study, noted in the bulletin that the Dominion of Canada, then still part of the British Empire, did a much better job monitoring foodstuffs than the United States did. But even so, a recent Canadian marketplace survey had found widespread and rather astonishing levels of fakery.

  Richardson, writing in the bulletin, tallied up the damage: Commercially sold dry mustard registered at 100 percent adulteration, allspice at 92.5 percent, cloves 83.3 percent, and ginger 55.5 percent. The Canadian analysis also provided some specifics. For instance, scientists there had found a mixture of ground wheat chaff colored with red clay, with a little inexpensive cayenne pepper thrown in, masquerading as ground ginger. When Richardson examined American-sold “ginger,” he discovered burned shells, cracker dust, ground seed husks, and dyes. He also noted that some states—Massachusetts, New York, New Jersey, and Michigan—did require spices to be tested for authenticity and purity and that the results had been appalling. In 1882 Massachusetts regulators had found 100 percent adulteration of ground “cloves,” which seemed to be mostly burned seashells. The same year, the state’s black “pepper” samples turned up as largely charcoal and sawdust.

  When Wiley’s team did its own ground pepper analysis, the chemists found it difficult to list or even figure out everything that was in the mixtures: sawdust, cereal crumbs, gypsum, potato scraps, hemp seed, and “to an astonishing extent” powdered olive stones, walnut shells, almond shells, “mineral matter,” sand, soil, and more. The chemists mockingly called the spice “pepperette.” A new, inexpensive product labeled “pepper dust” they found to be literal dust, apparently common floor sweepings.

  “Pepper is more in demand than any other spice and is in consequence more adulterated,” explained Richardson. Consumers were too trusting and didn’t examine the spices they bought. With his naked eye, he’d been able to pick out crumbled crackers and charcoal in a black pepper sample. He could also pick out the crumbles of brick dust in so-called cayenne pepper. Using a microscope, he detected sawdust in the spice mixes, distinguishing the larger tree cells from the finer cellular structure of a peppercorn.

  Some manufacturers took a one-size-fits-all approach to fakery. One New York firm—a purveyor of pepper, mustard, cloves, cinnamon, cassia, allspice, nutmeg, ginger, and mace—had purchased five thousand pounds of coconut shells a year for grinding and adding to every spice on that list.

  Other cheats made “mustard” by mixing water with coarsely ground flour or crumbled gypsum, a mineral commonly used to make plaster. To give the resulting sludge a mustardlike tint, Martius yellow (more technically 2,4-dinitro-1-naphthol), a coal-tar dye built on benzene rings (and related to naphthalene, a primary ingredient in mothballs), was added. The chemists had discovered this by adding alcohol to the “mustard” powder, separating out the dye, and analyzing its formula.

  Richardson predicted that if manufacturers realized how easily this fakery could be detected, they would find another, perhaps even more harmful, chemical to substitute. With bureaucratic understatement, he noted that spices offered “large scope for inventive genius.”

  With Wiley’s collaboration, he wrote an overtly political call to action in his conclusion to the report. It would be difficult to prevent the rise of such “manufactured” food, he said, “without some governmental action.” Crooked spice processors undercut the prices of their honest competitors, leaving no economic incentive for pure products. “When proper legislation has found a place on the statute-books,” the report continued, “the manufacturers will find themselves in a position where, without detriment to themselves, they can all unite in giving up the practice.” Wiley underlined the point in the letter to Colman that served as the preface to the spice report: “The necessity for some means for the suppression of the present universal sophistication of spices and condiments seems urgent.” Richardson was so disgusted by his findings that he asked to be transferred to another line of research and spent the next years analyzing seeds and grasses.

  * * *


  The third and final Bulletin 13 report of 1887 was devoted to “Fermented Alcoholic Beverages, Malt Liquors, Wine, and Cider.” That particular investigation was prompted at least in part by a growing concern about salicylic acid, a preservative that wine bottlers increasingly used—and in increasing amounts—to lengthen the shelf life of their products.

  Found in plants such as meadowsweet, wintergreen, and most commonly the bark of willow trees, this natural substance had been used as a pain reliever since the days of ancient Egypt. The Greek physician Hippocrates praised it in the fifth century BCE and Native American healers knew it well. But the names “salicin” and “salicylic acid” were coined only after scientists learned to extract the pure compound from the white willow, Salix alba, in the early nineteenth century. They also discovered a side effect. When ingested in high doses, pure salicylic acid caused gastric bleeding.

  A few decades later, in Germany, pioneering organic chemist Hermann Kolbe and lab partner Rudolf Schmitt developed an economical method for synthesizing large quantities of salicylic acid in a laboratory. They used sodium carbolate, a salt of carbolic acid (phenol), as the base for creating needle-shaped crystals that could be replicated time and time again and then ground into a fine powder. Laboratory workers learned to avoid getting a whiff of the crystalline dust, which produced almost instant irritation of the mucous membranes in the nose and set off a sneezing fit. Seeking to establish a safe dose, Kolbe used himself as a test subject, ingesting one-half to one gram daily over several days with no apparent ill effect. His conclusion was that the compound was basically safe if administered in cautious doses.