But What If We’re Wrong? Thinking About the Present As If It Were the Past
One of the years I lived in Akron was the year 2000. Technically, that was sixteen years ago. But those sixteen years might as well be 160, and here’s proof: When I wasn’t watching the hedgehog from my balcony, I was watching MTV, and they still occasionally played music. The video that seemed most pertinent at the time was “Testify” by Rage Against the Machine, directed by Michael Moore. Now, I was twenty-eight years old, so I considered myself too mature to take Rage Against the Machine seriously (that seemed like something you did when you were nineteen) and too cool to like their music as music (that seemed like something you did when you were twenty-seven). But I was still dumb enough to trust Michael Moore, so I liked this video. The premise was that George W. Bush and Al Gore were the same person, controlled by the same puppet masters and working for the same interests. We see a clip of Bush expressing support for the death penalty, followed by a clip of Gore saying the exact same thing. Bush extols the virtues of free trade, mirrored by Gore praising free trade. Bush is seen dancing with balloons and Gore is captured in a conga line, and then RATM jams econo in a wood-paneled studio (to a song that is, in retrospect, propulsive and committed, taken from an album I probably underrated). We get a supercut of newsmakers in quick succession—Sonny Bono, Ken Starr, the pope, Bill Clinton—with the ingrained implication that they are all complicit in some big-money boondoggle, and that all politicians and parties are fundamentally interchangeable. The video ends with a message from Ralph Nader.
Part of the reason I appreciated this video was that I agreed with it. The other part was that the message seemed so self-evidently true that I couldn’t believe a group as politically impractical as Rage Against the Machine was the band making it (“Tom Morello is finally embracing pragmatism,” I pragmatically assumed). I stayed up until three a.m. on November 8, watching the results of an election that was closer than I ever imagined possible. Bush won Ohio by about 165,000 votes. Gore lost his home state of Tennessee and was upset in New Hampshire, where Nader got 4 percent of the ballots. Florida was called for Bush at 2:17 a.m., providing him a victory much of the country did not accept as legitimate. I watched the whole event like it was a well-played Olympic hockey game between Norway and Finland. I loved it with emotional and cerebral distance, for totally apolitical reasons. The ultimate outcome didn’t bother me, because—like Michael Moore and Zack de la Rocha—I naïvely viewed these men as transposable. Most Americans did, as is illustrated by the fact that no one seemed particularly outraged when the Supreme Court halted the Florida recount and effectively sealed Bush’s victory, except for those performative armchair revolutionaries who express reflexive outrage over everything. I don’t remember any windows getting shattered or banks being burned.
Obviously, no one thinks like this now. In fact, they don’t even think that this was how they thought at the time: Huge swaths of the populace have retroactively convinced themselves that their reaction to the 2000 election was far more extreme than all evidence suggests. When Bush took office in January, it felt perfunctory. That September, the world changed completely. America adopted a level of political polarization that had not existed since Reconstruction, which now feels like the normal way to think about society. This, I grant, is no profound revelation: The world evolves, so perspectives evolve with it. Two US cities experienced a traumatic event, and that event cascaded into smaller, unrelated events. But just because something can be explained doesn’t mean it’s simple. Sixteen years ago, it was reasonable to believe there was no meaningful difference between Democratic leadership and Republican leadership. That ended up being wrong. But did it become wrong, or was it already wrong in 1999? And if this kind of partisan ambivalence eventually returns to prominence—and it almost certainly will—does our current period of polarization become an aberration? Are we actually wrong now?
Let me get back to that hedgehog: The view from my Akron apartment faced the back of the building. There was an apple tree in the yard, and the (comically obese) hedgehog would sit underneath its branches and longingly stare at the low-hanging fruit. It often seemed like he was torturing himself, because there was no way a hedgehog of his ample girth could reach an apple two feet above his head. Yet every time he did this, he knew what he was doing. Every time, or at least every time I happened to be watching, an apple would eventually fall to the ground, and he would waddle over and eat it. He was a brilliant goddamn hedgehog. I couldn’t stop thinking about it. When I went on dates—and maybe this explains why I was single—I would always talk about this hedgehog, inevitably noting a platitude that often applies to politics. The clever fox knows many things, states the proverb, but the old hedgehog knows one big thing. “I finally understand what that means,” I’d tell the confused woman sitting across from me. “The old hedgehog knows that gravity applies to fruit.” This banter, I must admit, did not lead to any canoodling (although most women did laugh, and one literally said, “You sure know a lot about hedgehogs,” which I almost count as a common-law marriage). It did, however, lead to a lot of casual discussion about what this phrase is supposed to mean. The origin of fox vs. hedgehog is Greek, but it was popularized by the British essayist Isaiah Berlin (note: These were not details I knew in 2000). In a plain sense, the adage simply means that some people know a little about many subjects while other people know a lot about one subject. Taken at face value, it seems like the former quality should be preferable to the latter—yet we know this is not true, due to the inclusion of the word “but.” The fox knows a lot, but the hedgehog knows one singular thing that obviously matters more. So what is that singular thing? Well, maybe this: The fox knows all the facts, and the fox can place those facts into a logical context. The fox can see how history and politics intertwine, and he can knit them into a nonfiction novel that makes narrative sense. But the fox can’t see the future, so he assumes it does not exist. The fox is a naïve realist who believes the complicated novel he has constructed is almost complete. Meanwhile, the hedgehog constructs nothing. He just reads over the fox’s shoulder. But he understands something about the manuscript that the fox can’t comprehend—this book will never be finished. The fox thinks he’s at the end, but he hasn’t even reached the middle. What the fox views as conclusions are only plot mechanics, which means they’ll eventually represent the opposite of whatever they seem to suggest.
This is the difference between the fox and the hedgehog. Both creatures know that storytelling is everything, and that the only way modern people can understand history and politics is through the machinations of a story. But only the hedgehog knows that storytelling is secretly the problem, which is why the fox is constantly wrong.
[2]“History is the autobiography of a madman,” wrote Alexander Herzen, a nineteenth-century Russian who helped define socialism and agrarian populism. Of course, I did not discover this slogan by reading about socialist farmers. I saw it on a promotional T-shirt. The shirt promoted Hardcore History, a podcast conducted by a man living in Oregon named Dan Carlin. Unlike most podcasts, Hardcore History is not a conversation or an interview or a comedic debate—it’s just one guy sitting in a studio, talking about history. And Carlin talks a long time: His lecture on World War I clocks in at over four hours. He doesn’t classify himself as a historian, because he doesn’t have a PhD. (“There’s a real divide between historians and non-historians,” he says. “I don’t want historians to think that I’m a historian, if you know what I mean.”) His mother, retired actress Lynn Carlin, is still more famous than he is. But his podcast is fascinating, mostly due to Carlin’s knowledge but also because of his perspective. If my goal with this book is to think about the present as if it were the distant past, the goal of Carlin’s podcast is to think about the distant past as if it were the present. When he talks about historical periods that seem retrospectively unhinged—the Red Scare, the era of Attila the Hun, the administration of Teddy Roosevelt—he resists the urge to view these events as insane aberrations that could never exist in modernity. Instead, he places himself inside the lives of long-dead people he’s never met and tries to imagine how the world must have appeared to them, at that time and in that place. Which, he concedes, is antithetical to how serious history is now conducted.
“If someone pursued history at Harvard University fifty years ago, it would have been clumped in with the humanities, mixed in with religion and law and language and art and those kinds of subjects,” Carlin says. “But if you did this today, it’s much more likely to be mixed in with the soft sciences, with archaeology and anthropology and those kinds of things. The good part about that change is that historians are much more diligent about facts than they used to be, and much more careful and much more quantified, and they’re likely to talk about things like radiocarbon dating. They sound more like archaeologists. But the downside is—when you’re talking about stories that involve human beings—there’s a lot of it that’s just not quantifiable.”
What Carlin is describing, really, is a dispute over the way true stories should be told. And this is important, because there really isn’t a second option. Storytelling’s relationship to history is a little like interviewing’s relationship to journalism: a flawed process without a better alternative. We are socially conditioned to understand the universe through storytelling, and—even if we weren’t—there’s neurological evidence that the left hemisphere of our brain automatically organizes information into an explainable, reassuring narrative. This is how the world will be understood, even if we desire otherwise. So which mode of storytelling is preferable? Is it better to strictly rely on verifiable facts, even if that makes the story inherently incomplete? Or is it better to conscientiously interpret events, which often turns history into an informed opinion? According to Carlin, the former methodology is becoming increasingly dominant. Barring an unforeseeable academic reversal, one can infer that this fact-oriented slant will only gain momentum. It will eventually be the only way future historians consider the present era of America. And that will paint a much different portrait from the interpretive America we’re actually experiencing.
Near the end of our phone conversation, Carlin and I start talking about Ronald Reagan. “I don’t know what your views are, Chuck, but I lived through that period,” says the fifty-year-old Carlin. “I don’t understand the hero worship at all. I can’t get my mind around it.” We then run through the various problems with Reagan’s presidential tenure, namely the lowering of the top marginal income tax rate on the super-rich from 70 percent to 28 percent and (what Carlin considers) the myth of Reagan’s destruction of the Soviet Union. “The reason the Soviet Union fell was that it was a system designed on an early-twentieth-century model that could not incorporate the changes necessary for the late twentieth century,” he explains. “The idea that Reagan somehow foresaw that is, to me, insane.” These points, along with his disempowering of labor unions and the deregulation of business, tend to be the tangible aspects of Reagan’s presidency most often noted by presidential scholars. He was, factually, a bad president. But this contradicts something obvious. “I think that if you polled a bunch of random Americans,” concedes Carlin, “a significant number would think Reagan belongs on Mount Rushmore.” Even as a decomposed corpse, Reagan remains an extremely popular leader, at least among those who liked him when he was alive. His 1984 win over Walter Mondale was the most lopsided landslide in electoral history, and he exited office with an approval rating of 63 percent. He was the ultra-hedgehog, obsessed with only one truth: If people feel optimistic about where they live, details don’t matter. But here’s the thing—you need to have an active, living memory of Reagan for any of this to seem plausible. You need to personally remember that the 1980s felt prosperous, even when they weren’t. Every extension of mainstream popular culture expressed this. The 1980s felt prosperous even if you were poor. Somewhat ironically, Carlin can’t reconcile Reagan’s legacy, because he has distanced himself from his own memory. He’s unconsciously applied a fact-based perception, just like those (currently unborn) historians who will dictate reality in the year 2222. Those historians will look back at the 1980s and presume the US populace must have suffered some kind of mass delusion, prompting them to self-destructively lionize a president who—factually—made the country worse. Within the minds of those historiographers, Reagan will be defined as an objectively bad president . . . except, of course, for that eight-year period when he actually was president, when he was beloved and unbeatable and so emotionally persuasive that—twenty-five years after he left office—his most ardent disciples sincerely suggested his face be carved into a South Dakota mountain. And that will make no narrative sense, except to Herzen’s self-published madman.
[3]These illustrative examples, however, are still relegated to the pot of small spuds. The election of 2000 was less than a generation ago (as I type this sentence, those born the night it happened still can’t vote). Reagan’s success or failure is part of history, but it’s still recent history—he will be classified, at least for the next twenty-five or so years, as a modern president, subject to the push and pull of many of the same people who pushed and pulled when he was sitting in the Oval Office. And even when all those pundits are finally gone, Reagan’s merits will continue to incrementally rise and incrementally fall, simply because he held the one job that is re-ranked and re-imagined every single year. The way we think about presidential history is shifting sand; it would be like re-ranking the top twenty college football teams from the 1971 season every new September and having the sequential order (somehow) never be the same. When I was in college, everyone told me the worst president of all time was Ulysses S. Grant. But we now consider Grant to be merely subpar. The preferred answer to that question has become James Buchanan. On the final day of 2014, U.S. News & World Report classified Grant as only the seventh-worst president of all time, almost as good as William Henry Harrison (who was president for only thirty-one days). I have no idea how this happened. If Grant can manage to stay dead, he might become halfway decent. He could overtake Grover Cleveland!
When we elect the wrong president (or if we remember that president inappropriately), certain things happen. But nothing that can’t be undone. If Buchanan truly was the worst president, his failure has had about as much impact on contemporary society as the cancellation of Two and a Half Men. Big potatoes don’t dwell on personalities. From a political science perspective, they dwell on ideas—towering ideas that could never be changed, regardless of the arguments against them. These are things like the concept of privately owned property, freedom of speech, and voting. These are elements so deeply embedded in the fabric of American civilization that we would never seriously debate their worth in a non-academic setting (and even then, only as a thought experiment). Yet if we are wrong about these ideas—if we are wrong about the value of our most fundamental values—the cost will eventually be cataclysmic. And we will just have to wait for that unstoppable cataclysm to transpire, the way the West Coast waits for earthquakes.
Every few months, something happens in the culture that prompts people to believe America is doomed. Maybe a presidential candidate suggests the pyramids were built to store wheat; maybe Miley Cyrus licks someone’s face at the Video Music Awards; maybe a student at Yale insists her college is not supposed to be an intellectual space, based on a fear of hypothetical Halloween costumes. The story becomes an allegory, and unoriginal idiots on the local news and the Internet inevitably suggest that this fleeting event is a sign that the United States is experiencing its own version of the fall of the Roman Empire. That’s always the comparison. The collapse of Rome has been something alarmists have loved and worried about since 1776, the year British historian Edward Gibbon published The History of the Decline and Fall of the Roman Empire. That was, probably coincidentally, the same year the US declared its independence.
What makes the United States so interesting and (arguably) “exceptional” is that it’s a superpower that did not happen accidentally. It did not evolve out of a preexisting system that had been the only system its founders could ever remember; it was planned and strategized from scratch, and it was built to last. Just about everyone agrees the founding fathers did a remarkably good job, considering the impossibility of the goal. But the key word here is “impossibility.” There is simply no way a person from that era—even a person as conscientious as James Madison—could reasonably anticipate how the world would change in the coming two hundred years (and certainly not how it would continue to change over the next two hundred following those, since we can’t even do that now, from our position in the middle). This logic leads to a strange question: If and when the United States does ultimately collapse, will that breakdown be a consequence of the Constitution itself? If it can be reasonably argued that it’s impossible to create a document that can withstand the evolution of any society for five hundred or a thousand or five thousand years, doesn’t that mean present-day America’s pathological adherence to the document we happened to inherit will eventually wreck everything?