Nickel and Dimed: On (Not) Getting by in America
There seems to be a vicious cycle at work here, making ours not just an economy but a culture of extreme inequality. Corporate decision makers, and even some two-bit entrepreneurs like my boss at The Maids, occupy an economic position miles above that of the underpaid people whose labor they depend on. For reasons that have more to do with class—and often racial—prejudice than with actual experience, they tend to fear and distrust the category of people from which they recruit their workers. Hence the perceived need for repressive management and intrusive measures like drug and personality testing. But these things cost money—$20,000 or more a year for a manager, $100 a pop for a drug test, and so on—and the high cost of repression results in ever more pressure to hold wages down. The larger society seems to be caught up in a similar cycle: cutting public services for the poor, which are sometimes referred to collectively as the “social wage,” while investing ever more heavily in prisons and cops. And in the larger society, too, the cost of repression becomes another factor weighing against the expansion or restoration of needed services. It is a tragic cycle, condemning us to ever deeper inequality, and in the long run, almost no one benefits but the agents of repression themselves.
But whatever keeps wages low—and I’m sure my comments have barely scratched the surface—the result is that many people earn far less than they need to live on. How much is that? The Economic Policy Institute recently reviewed dozens of studies of what constitutes a “living wage” and came up with an average figure of $30,000 a year for a family of one adult and two children, which amounts to a wage of $14 an hour for full-time, year-round work. This is not the very minimum such a family could live on; the budget includes health insurance, a telephone, and child care at a licensed center, for example, which are well beyond the reach of millions. But it does not include restaurant meals, video rentals, Internet access, wine and liquor, cigarettes and lottery tickets, or even very much meat. The shocking thing is that the majority of American workers, about 60 percent, earn less than $14 an hour. Many of them get by only by teaming up with another wage earner, a spouse or grown child. Some draw on government help in the form of food stamps, housing vouchers, the earned income tax credit, or—for those coming off welfare in relatively generous states—subsidized child care. But others—single mothers for example—have nothing but their own wages to live on, no matter how many mouths there are to feed.
Employers will look at that $30,000 figure, which is over twice what they currently pay entry-level workers, and see nothing but bankruptcy ahead. Indeed, it is probably impossible for the private sector to provide everyone with an adequate standard of living through wages, or even wages plus benefits, alone: too much of what we need, such as reliable child care, is just too expensive, even for middle-class families. Most civilized nations compensate for the inadequacy of wages by providing relatively generous public services such as health insurance, free or subsidized child care, subsidized housing, and effective public transportation. But the United States, for all its wealth, leaves its citizens to fend for themselves—facing market-based rents, for example, on their wages alone. For millions of Americans, that $10—or even $8 or $6—hourly wage is all there is.
It is common, among the nonpoor, to think of poverty as a sustainable condition—austere, perhaps, but they get by somehow, don’t they? They are “always with us.” What is harder for the nonpoor to see is poverty as acute distress: The lunch that consists of Doritos or hot dog rolls, leading to faintness before the end of the shift. The “home” that is also a car or a van. The illness or injury that must be “worked through,” with gritted teeth, because there’s no sick pay or health insurance and the loss of one day’s pay will mean no groceries for the next. These experiences are not part of a sustainable lifestyle, even a lifestyle of chronic deprivation and relentless low-level punishment. They are, by almost any standard of subsistence, emergency situations. And that is how we should see the poverty of so many millions of low-wage Americans—as a state of emergency.
In the summer of 2000 I returned—permanently, I have every reason to hope—to my customary place in the socioeconomic spectrum. I go to restaurants, often far finer ones than the places where I worked, and sit down at a table. I sleep in hotel rooms that someone else has cleaned and shop in stores that others will tidy when I leave. To go from the bottom 20 percent to the top 20 percent is to enter a magical world where needs are met, problems are solved, almost without any intermediate effort. If you want to get somewhere fast, you hail a cab. If your aged parents have grown tiresome or incontinent, you put them away where others will deal with their dirty diapers and dementia. If you are part of the upper-middle-class majority that employs a maid or maid service, you return from work to find the house miraculously restored to order—the toilet bowls shit-free and gleaming, the socks that you left on the floor levitated back to their normal dwelling place. Here, sweat is a metaphor for hard work, but seldom its consequence. Hundreds of little things get done, reliably and routinely every day, without anyone’s seeming to do them.
The top 20 percent routinely exercises other, far more consequential forms of power in the world. This stratum, which contains what I have termed in an earlier book the “professional-managerial class,” is the home of our decision makers, opinion shapers, culture creators—our professors, lawyers, executives, entertainers, politicians, judges, writers, producers, and editors.13 When they speak, they are listened to. When they complain, someone usually scurries to correct the problem and apologize for it. If they complain often enough, someone far below them in wealth and influence may be chastised or even fired. Political power, too, is concentrated within the top 20 percent, since its members are far more likely than the poor—or even the middle class—to discern the all-too-tiny distinctions between candidates that can make it seem worthwhile to contribute, participate, and vote. In all these ways, the affluent exert inordinate power over the lives of the less affluent, and especially over the lives of the poor, determining what public services will be available, if any, what minimum wage, what laws governing the treatment of labor.
So it is alarming, upon returning to the upper middle class from a sojourn, however artificial and temporary, among the poor, to find the rabbit hole close so suddenly and completely behind me. You were where, doing what? Some odd optical property of our highly polarized and unequal society makes the poor almost invisible to their economic superiors. The poor can see the affluent easily enough—on television, for example, or on the covers of magazines. But the affluent rarely see the poor or, if they do catch sight of them in some public space, rarely know what they’re seeing, since—thanks to consignment stores and, yes, Wal-Mart—the poor are usually able to disguise themselves as members of the more comfortable classes. Forty years ago the hot journalistic topic was the “discovery of the poor” in their inner-city and Appalachian “pockets of poverty.” Today you are more likely to find commentary on their “disappearance,” either as a supposed demographic reality or as a shortcoming of the middle-class imagination.
In a 2000 article on the “disappearing poor,” journalist James Fallows reports that, from the vantage point of the Internet’s nouveaux riches, it is “hard to understand people for whom a million dollars would be a fortune . . . not to mention those for whom $246 is a full week’s earnings.”14 Among the reasons he and others have cited for the blindness of the affluent is the fact that they are less and less likely to share spaces and services with the poor. As public schools and other public services deteriorate, those who can afford to do so send their children to private schools and spend their off-hours in private spaces—health clubs, for example, instead of the local park. They don’t ride on public buses and subways. They withdraw from mixed neighborhoods into distant suburbs, gated communities, or guarded apartment towers; they shop in stores that, in line with the prevailing “market segmentation,” are designed to appeal to the affluent alone. Even the affluent young are increasingly unlikely to spend their summers learning how the “other half” lives, as lifeguards, waitresses, or housekeepers at resort hotels. The New York Times reports that they now prefer career-relevant activities like summer school or interning in an appropriate professional setting to the “sweaty, low-paid and mind-numbing slots that have long been their lot.”15
Then, too, the particular political moment favors what almost looks like a “conspiracy of silence” on the subject of poverty and the poor. The Democrats are not eager to find flaws in the period of “unprecedented prosperity” they take credit for; the Republicans have lost interest in the poor now that “welfare-as-we-know-it” has ended. Welfare reform itself is a factor weighing against any close investigation of the conditions of the poor. Both parties heartily endorsed it, and to acknowledge that low-wage work doesn’t lift people out of poverty would be to admit that it may have been, in human terms, a catastrophic mistake. In fact, very little is known about the fate of former welfare recipients because the 1996 welfare reform legislation blithely failed to include any provision for monitoring their postwelfare economic condition. Media accounts persistently bright-side the situation, highlighting the occasional success stories and downplaying the acknowledged increase in hunger.16 And sometimes there seems to be almost deliberate deception. In June 2000, the press rushed to hail a study supposedly showing that Minnesota’s welfare-to-work program had sharply reduced poverty and was, as Time magazine put it, a “winner.”17 Overlooked in these reports was the fact that the program in question was a pilot project that offered far more generous child care and other subsidies than Minnesota’s actual welfare reform program. Perhaps the error can be forgiven—the pilot project, which ended in 1997, had the same name, Minnesota Family Investment Program, as Minnesota’s much larger, ongoing welfare reform program.18
You would have to read a great many newspapers very carefully, cover to cover, to see the signs of distress. You would find, for example, that in 1999 Massachusetts food pantries reported a 72 percent increase in the demand for their services over the previous year, that Texas food banks were “scrounging” for food, despite donations at or above 1998 levels, as were those in Atlanta.19 You might learn that in San Diego the Catholic Church could no longer, as of January 2000, accept homeless families at its shelter, which happens to be the city’s largest, because it was already operating at twice its normal capacity.20 You would come across news of a study showing that the percentage of Wisconsin food-stamp families in “extreme poverty”—defined as less than 50 percent of the federal poverty line—has tripled in the last decade to more than 30 percent.21 You might discover that, nationwide, America’s food banks are experiencing “a torrent of need which [they] cannot meet” and that, according to a survey conducted by the U.S. Conference of Mayors, 67 percent of the adults requesting emergency food aid are people with jobs.22
One reason nobody bothers to pull all these stories together and announce a widespread state of emergency may be that Americans of the newspaper-reading professional middle class are used to thinking of poverty as a consequence of unemployment. During the heyday of downsizing in the Reagan years, it very often was, and it still is for many inner-city residents who have no way of getting to the proliferating entry-level jobs on urban peripheries. When unemployment causes poverty, we know how to state the problem—typically, “the economy isn’t growing fast enough”—and we know what the traditional liberal solution is—“full employment.” But when we have full or nearly full employment, when jobs are available to any job seeker who can get to them, then the problem goes deeper and begins to cut into that web of expectations that make up the “social contract.” According to a recent poll conducted by Jobs for the Future, a Boston-based employment research firm, 94 percent of Americans agree that “people who work full-time should be able to earn enough to keep their families out of poverty.”23 I grew up hearing over and over, to the point of tedium, that “hard work” was the secret of success: “Work hard and you’ll get ahead” or “It’s hard work that got us where we are.” No one ever said that you could work hard—harder even than you ever thought possible—and still find yourself sinking ever deeper into poverty and debt.
When poor single mothers had the option of remaining out of the labor force on welfare, the middle and upper middle class tended to view them with a certain impatience, if not disgust. The welfare poor were excoriated for their laziness, their persistence in reproducing in unfavorable circumstances, their presumed addictions, and above all for their “dependency.” Here they were, content to live off “government handouts” instead of seeking “self-sufficiency,” like everyone else, through a job. They needed to get their act together, learn how to wind an alarm clock, get out there and get to work. But now that government has largely withdrawn its “handouts,” now that the overwhelming majority of the poor are out there toiling in Wal-Mart or Wendy’s—well, what are we to think of them? Disapproval and condescension no longer apply, so what outlook makes sense?
Guilt, you may be thinking warily. Isn’t that what we’re supposed to feel? But guilt doesn’t go anywhere near far enough; the appropriate emotion is shame—shame at our own dependency, in this case, on the underpaid labor of others. When someone works for less pay than she can live on—when, for example, she goes hungry so that you can eat more cheaply and conveniently—then she has made a great sacrifice for you, she has made you a gift of some part of her abilities, her health, and her life. The “working poor,” as they are approvingly termed, are in fact the major philanthropists of our society. They neglect their own children so that the children of others will be cared for; they live in substandard housing so that other homes will be shiny and perfect; they endure privation so that inflation will be low and stock prices high. To be a member of the working poor is to be an anonymous donor, a nameless benefactor, to everyone else. As Gail, one of my restaurant coworkers, put it, “You give and you give.”
Someday, of course—and I will make no predictions as to exactly when—they are bound to tire of getting so little in return and to demand to be paid what they’re worth. There’ll be a lot of anger when that day comes, and strikes and disruption. But the sky will not fall, and we will all be better off for it in the end.
1 Jared Bernstein, Chauna Brocht, and Maggie Spade-Aguilar, “How Much Is Enough? Basic Family Budgets for Working Families,” Economic Policy Institute, Washington, D.C., 2000, p. 14.
2 “Companies Try Dipping Deeper into Labor Pool,” New York Times, March 26, 2000.
3 “An Epitaph for a Rule That Just Won’t Die,” New York Times, July 30, 2000.
4 “Fact or Fallacy: Labor Shortage May Really Be Wage Stagnation,” Chicago Tribune, July 2, 2000; “It’s a Wage Shortage, Not a Labor Shortage,” Minneapolis Star Tribune, March 25, 2000.
5 I thank John Schmitt at the Economic Policy Institute in Washington, D.C., for preparing the wage data for me.
6 Interview, July 18, 2000.
7 “Companies Try Dipping Deeper into Labor Pool,” New York Times, March 26, 2000.
8 Personal communication, July 24, 2000.
9 “The Biggest Company Secret: Workers Challenge Employer Practices on Pay Confidentiality,” New York Times, July 28, 2000.
10 Bob Ortega, In Sam We Trust, p. 356; “Former Wal-Mart Workers File Overtime Suit in Harrison County,” Charleston Gazette, January 24, 1999.
11 See, for example, C. A. Shively, K. Laber-Laird, and R. F. Anton, “Behavior and Physiology of Social Stress and Depression in Female Cynomolgus Monkeys,” Biological Psychiatry 41:8 (1997), pp. 871–82, and D. C. Blanchard et al., “Visible Burrow System as a Model of Chronic Social Stress: Behavioral and Neuroendocrine Correlates,” Psychoneuroendocrinology 20:2 (1995), pp. 117–34.
12 See, for example, chapter 7, “Conformity,” in David G. Myers, Social Psychology (McGraw-Hill, 1987).
13 Fear of Falling: The Inner Life of the Middle Class (Pantheon, 1989).
14 “The Invisible Poor,” New York Times Magazine, March 19, 2000.
15 “Summer Work Is Out of Favor with the Young,” New York Times, June 18, 2000.
16 The National Journal reports that the “good news” is that almost six million people have left the welfare rolls since 1996, while the “rest of the story” includes the problem that “these people sometimes don’t have enough to eat” (“Welfare Reform, Act 2,” June 24, 2000, pp. 1,978–93).
17 “Minnesota’s Welfare Reform Proves a Winner,” Time, June 12, 2000.