But What if We're Wrong? Thinking About the Present as if It Were the Past
12 Which, technically speaking, would be a triangle.
13 This is probably obvious, but—just in case it isn’t—I should mention that whenever I call something “great,” I’m not arguing that I necessarily consider that particular thing to reflect any greatness to me personally, or even that I like (or fully understand) what that something is. I’m using it more like the editorial “we”: There is a general harmonic agreement that this particular thing is important and artful, both by people invested in supporting that assertion and (especially) by people who will accept that designation without really considering why. My own taste might play a role in the examples I select, and it’s certainly possible that I might misread society’s opinion. But it’s not part of my categorization process (at least not in this particular book). I mean, I’ve never finished a Faulkner novel. I’ve never loved a Joni Mitchell record or a Bergman film. But I still know they’re great (or “great”). I don’t need to personally agree with something in order to recognize that it’s true.
14 In 1998, three of the year’s ten best-selling fiction titles were written by romance novelist Danielle Steel, who somehow managed to have at least one book in the commercial top ten from 1983 to 1999. Steel is on pace to sell a billion books in her lifetime. Yet many of these novels don’t have Wikipedia entries. They are not even critically appraised by non-critics.
15 Or, in the case of George R. R. Martin’s A Song of Ice and Fire, a TV series.
16 Here is how cult writer Dennis Cooper described the term “cult writer” to The Paris Review: “It’s a weird term because it’s complimentary but condescending at the same time.”
17 It’s easy to imagine a future where commercial success matters much more than it currently does (since that has been the overall trend for the past two hundred fifty years). But it’s equally possible to imagine a future where the only culture is niche culture, and commercial success becomes irrelevant (or maybe even an anchor).
18 Yeah, I know: This sentence is fucking confusing. But it’s more straightforward than it seems: Our present time will eventually become the past, hence the designation “present (past).” Our future will eventually become the present, hence “present (future).” It’s kind of like the prologue to Star Wars, where we are told that the following events happened “a long time ago, in a galaxy far, far away.” But the people in Star Wars shoot laser guns and travel at the speed of light, so we are forced to conclude that their past is our future.
19 Here’s a simple way to parse this not-so-simple description: Play the song “Rock and Roll” by Led Zeppelin. Based on a traditional twelve-bar blues progression, “Rock and Roll” is the only song in the Zeppelin catalog that is literally rock and roll music, unless you count “Hot Dog” and “Boogie with Stu.” Every other Zeppelin song is a sophisticated iteration of “rock,” even when the drums are reggae. Jerry Lee Lewis played rock and roll. Jerry Garcia played rock. The song “Rock Around the Clock” is a full-on rock and roll number, but the Moody Blues’ “I’m Just a Singer (in a Rock and Roll Band),” Rick Derringer’s “Rock and Roll, Hoochie Koo,” and Bad Company’s “Rock ’n’ Roll Fantasy” remain inflexibly rock (with no rolling whatsoever). John Lennon’s 1975 solo album Rock ’n’ Roll is actually a self-conscious attempt at rock and roll, while Joan Jett’s 1982 cover of “I Love Rock ’n’ Roll” professes a love for something it technically isn’t. The least ambiguous rock and roll song ever recorded is “Tutti Frutti” by Little Richard, closely followed by the Kingsmen’s 1963 cover of “Louie Louie.” The least ambiguous rock ’n’ roll song is “(I Can’t Get No) Satisfaction” by the Rolling Stones. The least ambiguous rock song ever recorded is “I Like to Rock” by April Wine.
20 Obviously, there have always been living humans between the ages of twelve and twenty. But it wasn’t until after World War II that the notion of an “in between” period connecting the experience of childhood with the experience of adulthood became something people recognized as a real demographic. Prior to this, you were a child until you started working or got married; the moment that happened, you became an adult (even if those things happened when you were eleven).
21 In fact, it’s possible to imagine a fantastically far-flung future where rock music serves as a footnote to the Beatles, where rock only matters because it was the medium the Beatles happened to pursue. Rolling Stone writer Rob Sheffield has asserted this on multiple occasions, in at least two different bars. And this isn’t a solely retrospective opinion, either—people speculated about that possibility from the moment the Beatles broke up. When CBS News covered the group’s legal dissolution in 1970, the broadcaster only half-jokingly categorized the split as “an event so momentous that historians may one day view it as a landmark in the decline of the British Empire.”
22 This contrast is complicated by those who insist the Beatles were actually a pop band (as opposed to a rock band), based on the contention that the Beatles had no relationship to the blues (which is mostly true—John Lennon once described the track “Yer Blues” as a parody). But I’m not going to worry about this distinction here, since worrying about it might spiral into a debate over “rockism vs. poptimism,” an imaginary conflict that resembles how music writers would talk if they were characters on a TV show written by Aaron Sorkin.
23 What Good Are the Arts?
24 Here’s Campbell’s description of the monomyth from his book The Hero with a Thousand Faces: “A hero ventures forth from the world of common day into a region of supernatural wonder: fabulous forces are there encountered and a decisive victory is won: the hero comes back from this mysterious adventure with the power to bestow boons on his fellow man.” This is loosely tied to Carl Jung’s idea of the collective unconscious, and a heavy degree of symbolism needs to be applied—“supernatural wonder” can be anything creative or spiritual and the “mysterious adventure” (and its subsequent “boons”) can just be a productive, significant livelihood. These kinds of metaphors tie into another of Campbell’s core philosophies—the notion that all religions are true, but none are literal.
25 This controversy was small but still hilarious. Gioia’s issue with music writing—that it’s become overly obsessed with celebrity and personality—is something music critics had privately discussed among themselves for at least forty years. Gioia just wrote it in public, from the perspective of an uninvolved outsider. But more pressingly, I’m not sure if this categorization (even if true) is remotely troubling. Lifestyle reporting, when done well, informs how art can be understood and received. It aligns with the way most consumers interact with pop music. I don’t need to analyze bass tabs in order to recognize how the bass line on “Billie Jean” is a different level of awesome. In what universe is it fun to read about time signatures and chord changes? I want to hear more about the propofol and the Elephant Man bones and the crank calls to Russell Crowe’s hotel room. I want to know about the individual who imagined those bass lines in his head.
26 I have a tendency to get fixated on the connotation and definition of specific words, but particularly the word “rock.” Sometimes I think the word “rock” is literally the most important characteristic of the entire genre, in the same way the prefix “rag” seems to be the critical detail within all ragtime music. Perhaps the rock artist who outlives the ravages of time will simply be whichever artist employs the word “rock” most prominently when titling their musical compositions, which would mean the band who’ll eventually come to symbolize the entire rock idiom will be AC/DC (who’ve somehow done this on twenty-three separate occasions throughout their career). Weirdly, this would be a better resolution than almost every other possible scenario.
27 This was probably for the best. NASA would not want the aliens to overestimate the creative role of George Harrison.
28 Some might argue that the artist I’m describing here actually sounds more like Jimi Hendrix. But here’s the problem: Hendrix’s exploratory genius and musical vocabulary were so unique that he ended up being the polar opposite of a “pure distillation.” He was too inventive to represent anyone but himself.
29 There’s a brilliant moment in the 1995 PBS miniseries Rock & Roll when Gregg Allman mocks the term “Southern rock,” arguing that all rock music originated in the South: “Saying Southern rock is like saying rock rock.” This was back when Allman still had his original liver.
30 I used to work at the rock magazine SPIN, a print publication that existed for twenty-seven years. Like all rock magazines, SPIN annually published an “Albums of the Year” list, diligently selected by its editorial board to exemplify how SPIN defined artistic achievement during whatever week they happened to be compiling the list. Almost all of these rankings have been completely forgotten. It’s become extremely difficult to remember what album was chosen number one from any given year, even for the people who worked there and nominated the selections . . . except for the year 1991. That was the year SPIN placed Teenage Fanclub’s Bandwagonesque above Nirvana’s Nevermind. This singular misstep is cited more often than the combined total of every other selection made throughout the magazine’s other twenty-six years, exacerbated by the fact that SPIN ultimately put Kurt Cobain on the cover ten times, seven of which came after he was dead. Because it feels so wrong in retrospect, the 1991 list is the only one that historically matters.
31 Also known as “kids who were mostly interested in other kids, or at least dogs and cats.”
32 Unless you count Stephen Hawking, who is technically a cosmologist.
33 As a species, we may simply be unequipped for the concept of “infinity.” We can define it and we can accept it—but I don’t know if it’s possible for humans to truly comprehend a universe (or a series of universes) where everything that could happen will happen. I suspect the human conception of infinity is akin to a dog’s conception of a clock.
34 Greene is not exaggerating: He said he’s had the same argument at least ten times with David Gross, the winner of the Nobel Prize for physics in 2004. “Because we can’t falsify the idea,” Gross writes of the multiverse, “it isn’t science.” In other words, because there’s no way for the multiverse theory to be proven untrue, it can’t be examined through the scientific method.
35 When I first met this guy (his name is Mike Mathog), the only thing I knew about him was how much he hated an absurdist joke I’d made in one of my early books, where I claimed the probability of everything was always 50-50 (“Either something will happen, or something will not”). Mike has since invested a lot of conversational effort into proving I am empirically wrong about this, which means he’s invested a lot of conversational effort into proving I was incorrect about something I never actually believed in the first place. In fact, I feel like he’s brought this up in half the conversations we’ve had ever since the very first night we met. So every time I see him, the odds of this specific interaction happening again are 50-50.
36 If you’re the type who hates seeing buzzwords like “paradigm shift” in every piece of cultural analysis you encounter, blame Kuhn. He didn’t invent the term, but he introduced it to most normal people. Some have argued that The Structure of Scientific Revolutions is the most-read science book of all time, among non-scientists.
37 Or maybe just a different context for the word “law.” When people mention Newton’s laws, they use the term “laws” because the rules are unbreakable. But perhaps they are unbreakable only in nature. Maybe the barriers they represent are real, but we can still break them, as technology advances beyond the parameters of the natural world.
38 As far as I can tell, the official “edge” of the galaxy cannot be defined.
39 No clue as to how this would become irrefutably known. I guess it would require an anonymous, untraceable transmission from aliens?
40 Though some are tempted to connect this theory to the scenario described in The Matrix, there is no relationship. The Matrix suggests real human bodies could serve as batteries for the projection of a simulated world. This theory suggests “real humans” are not involved at all, at least within the projection itself.
41 “There have been suggestions that there might be actual evidence [of this] rather than supposition,” Tyson told me, much to my surprise. “The evidence is this: There is something called cosmic rays that are high-energy particles moving through the universe, and they’re accelerated to very high energies in the centers of galaxies by astrophysical phenomena we think we understand—though there are a lot of holes in this. It was noticed that there was an upper limit to the energy produced by these cosmic rays. Now, in practically anything else we’ve ever measured, there’s sort of a bell curve of how such things appear. Most are in some group, then there’s a tail, and it trails off to zero. With cosmic rays, the tail is cut off—there’s no gradual taper. It was suggested that if we were a simulation, you’d have to put in a limit to something that goes on within it. And this cutoff could be the program’s pre-calculated limit for the energy level of these cosmic rays. We could be up against that boundary. It’s an intriguing thought that we’re all just one big simulation. That being said . . . it would be hard to swallow.”
42 The cosmological constant is the value of the energy density of the vacuum of space. Now, I don’t understand what that means. But it’s one of those “twenty numbers” Brian Greene mentioned a few pages back—a number that has a value so specific and so inimitable that the universe as we know it could not exist if it were even .0001 percent different.
43 This is a super-fun book, but I don’t understand how the publisher was supposed to market it: It rejects every possible conspiracy theory, yet would only be of interest to people who are actively obsessed with conspiracy theories (and who would read this book with the sole purpose of examining the details of theories the author demonstrates to be false). It would be kind of like if I wrote and researched a 390-page book about Fleetwood Mac’s Rumours LP, but my whole point was that Fleetwood Mac is not worth listening to.
44 In his seven-volume collection History: Fiction or Science? Fomenko specifically cites Joseph Justus Scaliger, although it appears the Jesuits would also be involved here.
45 The “Dream Argument” is a two-pronged proposition: The first prong is that dreams sometimes seem so real to us that there’s no way to know when we’re dreaming and when we are not. The second prong is that—in the same way we usually don’t recognize we’re dreaming until we begin to wake up—it’s possible that what currently appears to be regular day-to-day reality will disintegrate the moment we reach lucidity. In other words, you may think you’re reading a footnote right now, but maybe you’re just having a nonlucid dream where a footnote is being read. And as soon as you realize this, the page will start to dissolve.
46 These psychiatrists are referred to as Hobson-McCarley (John Allan Hobson and Robert McCarley), the Lennon-McCartney of not caring about dreams.
47 This belief is so pervasive that even those who believe otherwise feel obligated to concede its prevalence. “In Western society, most people don’t pay too much attention to their dreams,” said Deirdre Barrett, an assistant professor of psychology at Harvard Medical School. Barrett has studied dreaming for forty years.
48 This is mirrored by the growth of cognitive behavioral therapy, a model of psychoanalysis that suggests many thoughts are merely “automatic thoughts” that should not be taken as literal depictions of what we truly believe or desire. For example, spontaneously imagining killing someone should not be taken as an indication that you secretly want to do so.
49 Dimethyltryptamine (usually referred to as DMT) can also be smoked recreationally. Manufactured DMT crystals are sprinkled atop marijuana buds and inhaled in one hit, generating a heavy, optical trip that lasts around ten minutes. Because the experience is so brief, DMT is sometimes called “the businessman’s hallucinogen.” It doesn’t demand a lot of free time. But in the same way that dream time is elastic, ten minutes on DMT can feel much, much longer.
50 I sometimes think I should have titled this book Aristotle: The Genius Who Was Wrong About Fucking Everything.
51 Some might question the espoused veracity of “the modern verification process,” on the basis of the publication of Stephen Glass’s imaginary exposés in The New Republic, Jayson Blair’s tenure at The New York Times, and the unreal University of Virginia rape account in Rolling Stone. But two things must be considered here. The first is that the process of fact-checking does have one unavoidable problem—there’s almost no way to verify a story that the writer has fabricated entirely, because you can’t prove a negative. It’s unreasonable for a magazine fact-checker to start from the premise that the reporter concocted a story out of thin air, since only a psychopath would do so. It would be like a doctor initiating every medical examination by asking the patient if she’s lying about feeling sick. The second point is that all these stories were, eventually, proven to be false. It just took a little longer than we’d prefer.