In the third-floor laboratory Gettler and his assistants duplicated the tests until they were sure of one fact: the tissue of a normal brain always contains a trace of ethyl alcohol as a result of normal metabolic processes. And trace was the right word: at most, natural alcohol content in the brain is two-thousandths of one percent (0.002 percent).
But that number, that barest gleam of alcohol, gave them a baseline reading to compare to the brains of people who had consumed alcohol. That was the real question anyway—how much alcohol was added when people drank, when they drank a little, and when they drank way too much? How much did it take before the brain was, one might fairly say, soaking in the stuff?
“WELL, Doctor, isn’t it a fact that I can give the same amount of alcohol to two people, and one may become intoxicated and the other not?” Gettler was forever being asked that question by attorneys representing clients charged with public intoxication or drunk driving or any of the alcohol offenses that had continued, without fail, to plague the criminal justice system during Prohibition.
“My answer to that is, we are not analyzing what the man gets to drink.” As he’d said for years, he didn’t care about that, didn’t care what was in the bottle, the stomach, or the intestines, didn’t even fully count the blood-alcohol level. “We are analyzing for the alcoholic content of the brain. Once it gets to the brain, it has an effect, and that effect is proportionate to the amount present” in those tissues.
To correlate the amount of alcohol in the brain with drunken behavior, Gettler had pored over the remains of people identified as drunk at time of death, people who’d fractured their skulls falling down stairs, meandered in front of a hurrying automobile, or tumbled onto train tracks and been collected in pieces. He’d matched them to witness statements gathered by the police, descriptions of those weaving, stumbling, and joking their way toward their final conclusion. For comparison, he’d also looked at the brains of people who’d consumed no alcohol, patients who’d died at Bellevue after lengthy hospitalizations, people whose tissues were guaranteed to contain no more than the normal baseline of ethyl alcohol.
All of that—the measurements of alcohol in the brain, the injuries, the behavior, the full context of those deaths—would go into the paper that he and Tiber published, which would be widely acclaimed as creating the first scientific scale of intoxication.
Gettler’s scale used the plus sign (+) to indicate each level of drunkenness. Each plus stood for a level above the normal baseline: plus one, plus two, plus three, or plus four. Spelled out, the scale read like this:

+ “In all cases in which there was an alcoholic content of the brain below 0.1 percent, the patients showed no obvious alcohol impairment.”
++ From 0.1 to 0.25 percent alcoholic content. Subjects showed slight inebriation: they were a little more aggressive than normal, a little less cautious in their behaviors. One man who had knocked back a couple of drinks, pitched into a bar fight, and was stabbed to death had a ++ rating.
+++ From 0.25 to 0.4 percent alcoholic content. In the hours before death, subjects were unsteady on their feet, loud, and judged drunk by everyone who saw them. A typical +++ was autopsied after he’d fallen down the stairs while drunk and broken his neck.
++++ From 0.4 to 0.6 percent alcoholic content. These subjects had died after becoming falling-down drunk. They had consumed so much liquor that they succumbed to ethyl alcohol poisoning, usually within several hours after reaching a hospital.
Gettler’s one-to-four scale of drunkenness was ideal for establishing intoxication at time of death, working as it did with brain tissue.
A companion rating scale—for the drunk drivers who survived a crash—would follow shortly. In 1931, an Indiana University toxicologist named Rolla Harger invented the “Drunk-o-Meter,” a device in which drinkers blew into a balloon, allowing chemists to analyze the vapors in their breath. Harger’s system, like Gettler’s, tied intoxication values to the behaviors observed at each level of chemical readings.
The Prohibition era had been a rich source of material for building a rigorous science of alcohol intoxication.
IN NEW York City, according to press reports, two men in a First Avenue speakeasy died from alcohol poisoning in August. Six died and twenty-one were hospitalized in September, all sickened by wood alcohol served in a Brooklyn bar. In October another twenty were dead in Newark. They’d apparently skipped the redistilling process entirely and guzzled straight industrial alcohol.
In December 1930 James Doran acknowledged that the death rate was worsening, at least for “a certain type of person with uncontrollable appetite.” The latest holiday drink was a mixture that included the alcohol used in antifreeze formulas. The antifreeze cocktail was a favorite of rail riders, traveling laborers who liked to sneak their rides on cargo trains. They called the drink “derail” for its near-instant brand of intoxication. It killed a few of them, sure, but they were used to the effects of borderline alcohol. The rail riders admitted—or sometimes boasted—that they’d drunk Sterno on occasion, or bay rum aftershave if there was nothing else to be found.
“These deaths have been attributed to so-called poison liquor,” Doran acknowledged, “namely denatured alcohol, manufactured under government supervision.” In response to continued criticism—Charles Norris had taken to describing the United States as the land of hypocrisy—Prohibition chemists had invented a new formula, to be introduced in 1931, which they thought would be repulsive without actually killing people.
The new formula mixed petroleum products into the alcohol, pretty much the sludge left behind during gasoline processing, and made it noxious with the rotten-egg reek of hydrogen sulfide. That, the chemists figured, would be unpleasant enough to discourage anyone conscious enough to get a whiff. To publicize the new approach, the Treasury Department invited journalists to come taste it. One experienced newshound immediately identified it as containing either benzene or ether. “It’s not as bad as some of the stuff you’ve been drinking,” Doran replied ironically, as the writer, clearly accustomed to the more usual versions of poisoned alcohol, chugged down another sample.
WHY WERE some people, journalists notoriously among them, seemingly able to guzzle alcohol without obvious effect? Why didn’t everyone fall flat after a marathon of martinis? These questions were asked enviously by hungover friends, resentfully by dry advocates. And they had perplexed scientists for decades.
Gettler could—and did—cite puzzled research papers dating back to the turn of the century. One of the most provocative, published in 1908, was a study of rats and rabbits that were provided a regular supply of ethyl alcohol. After the scientists had created a colony of alcoholic animals, they offered the same laboratory cocktails to animals with no previous exposure. The novice drinkers became stumbling drunk on the same amount of alcohol that no longer had any effect on the habituated rats and rabbits.
By doing a series of blood draws, the scientists found that their habitués had somehow learned to better process their drinking binges. The unaccustomed animals absorbed 20 percent more alcohol into their bloodstream in the first two hours than did the practiced ones. By the end of a day, first-time drinkers had blood-alcohol levels 66 percent higher than the experienced imbibers.
Working with his talented toxicology student Abe Freireich (who would later become chief medical examiner for New York’s Nassau County), Gettler decided to see if he could provide a better understanding of how alcohol is metabolized by different kinds of drinkers. They chose dogs for their experiments.
Gettler and Freireich assembled a research colony of twenty-four animals, half destined to become canine alcoholics. They started by giving those twelve dogs drinks that were 98 percent water and 2 percent ethyl alcohol. Over the next two months the alcohol portion was increased to 30 percent. When Gettler and Freireich were sure that their dogs were habituated to alcohol, they started the actual experiment; it would continue for two years before they were satisfied with the results.
They started simply, comparing the effects of alcohol on a dog accustomed to ethyl alcohol to one that had never touched it before. Both animals were given that 30 percent alcohol solution in increasing amounts. At half a cup, the chronic drinker was unfazed, as “playful as ever.” His companion, though, developed a slightly staggering walk and then sat down and refused to move. When they doubled the amount, even the dog accustomed to alcohol was affected: when set loose in the test chamber, his walk was a little unsteady, and he showed a preference for sitting down—although he would come when called. The other dog wove one precarious line around the room and then passed out, waking up almost twenty-five minutes later.
The two chemists repeated the comparison eleven more times, waiting a month or so between each test. The pattern never altered—ethyl alcohol visibly impaired the first-time drinker but had far less dramatic effect on the chronic imbiber. After each set of observations, the dogs were killed with illuminating gas. Samples of their brains, livers, blood, and spinal fluid were analyzed. Every time, “although the dogs were of the same weight, received the same quantity of alcohol, lived approximately the same length of time after the alcohol administration,” there was less alcohol content in the organs of the habitués than in those of the novice drinkers; on average, the scientists found twice as much alcohol in their novice dogs.
To be sure, the habituated dogs showed signs of intoxication, but they did so more slowly. Once the amount of alcohol in any dog’s brain climbed above 0.25 percent (+++ or more on the human scale), the animal became obviously tipsy, just as people did. This suggested something important to Gettler—that no one developed an actual immunity to the effects of alcohol. “If acquired resistance was the cause of tolerance, then the picture obtained from the analyses should be quite different,” he pointed out.
If their alcohol-experienced dogs had become resistant, they should have stayed perpetually sober. If they were immune to intoxication, then those +++ brain levels should have left them unaffected. But even the most hard-headed dogs reached a point, once enough alcohol had accumulated in their brains, at which they too appeared as stumbling drunks.
Essentially, the Gettler and Freireich study showed how the body of a habitual drinker adjusted, becoming more efficient at metabolizing ethyl alcohol. The liver generated more enzymes to break the alcohol down. More liquor was processed out, so less entered the bloodstream. It thus took longer for an intoxicating amount to accumulate and reach the brain. That wasn’t pure good news. Chronic drinkers needed to take in more alcohol, sometimes a lot more, to reach the level in the brain that produced intoxication. They usually responded by drinking more.
That was why experienced drinkers were credited with having such hard heads for liquor—they could drink at the same rate as their friends and be less affected. But it was not some kind of magical immunity that they acquired; rather, it was what Gettler and Freireich called the deceptively healthy, ultimately destructive internal chemistry of “the habitual drunkard.”
ON THE first day of January 1931 the department store heir Lee Adam Gimbel jumped from the sixteenth floor of the Yale Club, having seen his fortune disappear with the economic downturn. In February the once-wealthy owner of a shoe company poisoned himself in a downtown hotel. In March a plumbing supply manager jumped from the ninth floor of the New York Athletic Club. In April the broker-husband of heiress Jessie Woolworth killed himself with mercury bichloride. In May a law firm partner dived out of his third-floor room at the Hotel Commodore, dying of a fractured skull.
The stock market, already hammered in the Black Tuesday crash of 1929, had sputtered erratically before free-falling again, pushed even deeper into trouble by economic collapses in Europe and alarming crop losses across the Great Plains states, as what appeared to be a persistent drought settled into place. During 1930 stock values had fallen a full 40 percent and two thousand banks had failed nationwide. In 1931 those numbers got worse.
In his annual report, issued that spring, Charles Norris announced that New York City had reached a new high in violent deaths the previous year—6,525 across the five boroughs, driven by the leaping suicides that followed the economy’s downward spiral. Of those deaths, 1,471 were suicides, an average of four a day. That meant that nearly one-fourth of the city’s violent deaths could be attributed to despair.
DESPITE THE bleak increase in the medical examiner’s workload, the department’s budget had been slashed in response to the city’s own economic struggles.
“At the present time I am spending nearly $300 a month from my own personal funds for work which in my opinion has absolutely to be done to keep up the work of the office,” Charles Norris wrote to the mayor’s office in early 1932. “This is no pleasure to me in these depressed times, for I suffer from them just as much as anybody else. You can rest assured that I would not give out this amount of money monthly unless I considered it of importance.” His department had always run on a minimal budget. But now it was less than minimal.
Gettler was doing two men’s work on a salary of less than $4,000 a year; the city was refusing to fund the position of assistant chemist due to a budget shortfall, even though the salary was only $100 a month. Norris was paying a young chemist’s half-time salary out of his own pocket, because “he would be down and out without it.”
He was tired enough and angry enough to write a letter to the New York Times and complain about the mediocre administrators of a so-called great city and the difficulty of maintaining a world-class medical examiner’s office: “It has been an uphill battle from the start.” Norris wrote more privately to the mayor’s office, announcing that he needed a vacation. He was going to spend his money on something besides subsidizing the city. He and Mrs. Norris were leaving on a steamer trip to the West Indies for a month’s respite.
He would be back in late March. But if the New York City administrators hadn’t learned to appreciate good forensic science by that time, he would have to reconsider whether the medical examiner’s job was worth keeping.
“Altogether,” Norris wrote, “the situation is becoming one where I doubt the advisability of remaining in the office permanently.”
IT WASN’T JUST the budget frustrations. Norris had a depressing sense that he and other forensic scientists were trying to teach the rest of the world the same lessons, over and over again, and that the rest of the world was not really paying attention.
Harrison Martland was still trying to impress upon the rest of his profession that radium poisoning posed a public health hazard. The work had proved disheartening. He’d recently been shocked when a leading authority on industrial diseases “told me that this disease is an obscure one about which little is known.
“I cannot agree with this statement,” he wrote stiffly in the introduction to a 1931 paper on the hazards of luminous paint, detailing the work done by himself, by Gettler, by physicians working for the Consumers’ League, by researchers at Harvard and Yale, and by enough scientists that no one should call the problem obscure.
A steady drumbeat of dial painter deaths continued: the previous fall, Martland had logged the fifth death since the legal settlement by the New Jersey plant, a twenty-seven-year-old woman who had been bedridden at her sister’s home for months after her hips crumbled. She left behind a husband and a three-year-old daughter. One of the Radium Girls from the lawsuit was also dead; another was in the hospital, her right leg having fractured as she walked across a room.
When New Jersey newspapers discovered that Martland was keeping a list of these deaths, which journalists described as “a kind of doom book,” he was furious. Martland fielded dozens of calls asking whose names were in the book. He blamed the lawyers for informing the press and decided to do no further work for them. He feared that the barrage of publicity was turning the women into freaks.
“I naturally don’t like to talk of it,” Martland snapped to a curious reporter.
The list was intended as a record of the length of time it might take for symptoms to appear. As he’d noted in his recent paper, radium poisoning took two distinct forms. One was acute: these early deaths were characterized by severe aplastic anemia, a rapid disintegration of the bones of the jaw, spreading sores on the lips and tongue, and opportunistic bacterial infections.
The “later group,” as he called it, had probably absorbed less radium at the start. Those workers sickened more slowly, gradually developing “low-grade, crippling lesions in their bones,” which Martland thought were caused by the buildup of radioactive deposits in the bones themselves, followed by collapse within the bone marrow. Neither picture was a happy one. He hated the way reporters kept describing his data as a catalog of the doomed.
“This list would hardly bring pleasant thoughts to those whose names were on it.”
NORRIS RETURNED from his West Indies cruise, cheered by golden weather and legal rum. On March 31, only a couple of weeks later, another radium death occurred, one that would boost the issue out of its medical backwater.
Here was an “important” radium death, the difference in response lying in the social class of the deceased. This was no Italian-American factory worker from New Jersey but a fifty-two-year-old millionaire, an industrialist, an athlete, and a member of the social elite. Eben M. Byers was chairman of A.M. Byers Iron Factory and a director of the Bank of Pittsburgh and of the Pennsylvania and Lake Erie Dock Company.