The finite is nested within the infinite, and somewhere across the glittering, howling universal sample space of Buddha Field or Babel, your doppelgänger is hard at the keyboard, playing a Bach toccata.
ROBERT M. SAPOLSKY
Super Humanity
FROM Scientific American
SIT DOWN WITH an anthropologist to talk about the nature of humans, and you are likely to hear this chestnut: “Well, you have to remember that 99 percent of human history was spent on the open savanna in small hunter-gatherer bands.” It’s a classic cliché of science, and it’s true. Indeed, those millions of ancestral years produced many of our hallmark traits—upright walking and big brains, for instance. Of course, those wildly useful evolutionary innovations came at a price: achy backs from our bipedal stance; existential despair from our large, self-contemplative cerebral cortex. As is so often the case with evolution, there is no free lunch.
Compounding the challenges of those trade-offs, the world we have invented—and quite recently in the grand scheme of things—is dramatically different from the one to which our bodies and minds are adapted. Have your dinner come to you (thanks to the pizza delivery guy) instead of chasing it down on foot; log in to Facebook to interact with your nearest and dearest instead of spending the better part of every day with them for your whole life. But this is where the utility of the anthropologist’s cliché for explaining the human condition ends.
The reason for this mismatch between the setting we evolved to live in and the situations we encounter in our modern era derives from another defining characteristic of our kind, arguably the most important one: our impulse to push beyond the limitations evolution imposed on us by developing tools to make us faster, smarter, longer-lived. Science is one such tool—an invention that requires us to break out of our Stone Age seeing-is-believing mindset so that we can clear the next hurdle we encounter, be it a pandemic flu or climate change. You could call it the ultimate expression of humanity’s singular drive to aspire to be better than we are.
Human Oddities
To understand how natural selection molded us into the unique primates we have become, let us return to the ancestral savanna. That open terrain differed considerably from the woodlands our ape forebears called home. For one thing, the savanna sun blazed hotter; for another, nutritious plant foods were scarcer. In response, our predecessors lost their thick body hair to keep cool. And their molars dwindled as they abandoned a tough vegetarian diet for one focused in part on meat from grassland grazers—so much so that our molars are now nearly useless, with barely any grinding surface.
Meanwhile the selective demands of food scarcities sculpted our distant forebears into having a body that was extremely thrifty and good at storing calories. Now, having inherited that same metabolism, we hunt and gather Big Macs as diabetes becomes a worldwide scourge. Or consider how our immune systems evolved in a world where one hardly ever encountered someone carrying a novel pathogen. Today, if you sneeze near someone in an airport, your rhinovirus could be set free twelve time zones away by the next day.
Our human oddities abound where behavior is concerned. By primate standards, we are neither fish nor fowl in lots of ways. One example is particularly interesting. Primate species generally fall into two distinct types: on one hand, there are pair-bonding species, in which females and males form stable, long-lasting pairs that practice social and sexual monogamy. Monogamous males do some or even most of the caring for the young, and females and males in these species are roughly the same size and look very similar. Gibbons and numerous South American monkeys show this pattern. “Tournament” species take the opposite tack: females do all the child care, whereas males are far larger and come with all kinds of flashy displays of peacockery—namely, gaudy, conspicuous facial coloration and silver backs. These tournament males spend a ridiculous percentage of their time enmeshed in aggressive posturing. And then there are humans, who, by every anatomical, physiological, and even genetic measure, are neither classic pair-bonding nor tournament creatures and instead lie stuck and confused somewhere in the middle.
Yet in another behavioral regard, humans are textbook primates: we are intensely social, and our fanciest types of intelligence are the social kinds. A complex mathematical instance of transitivity may bewilder us primates, but it is simple for us to figure out that if person A dominates B, and B dominates C, then C had better grovel and submissively stick his butt up in the air when A shows up. We can follow extraordinarily complex scenarios of social interaction and figure out if a social contract has been violated (and are better at detecting someone cheating than someone being overly generous). And we are peerless when it comes to facial recognition: we even have an area of the cortex in the fusiform gyrus that specializes in this activity.
The selective advantages of evolving a highly social brain are obvious. It paved the way for us to fine-tune our capacities for reading one another’s mental states, to excel at social manipulation, and to adeptly deceive and attract potential mates and supporters. Among Americans, the extent of social intelligence in youth is a better predictor of our adult success in the occupational world than are SAT scores.
Indeed, when it comes to social intelligence in primates, humans reign supreme. The social-brain hypothesis of primate evolution is built on the fact that across primate species, the percentage of the brain devoted to the neocortex correlates with the average size of the social group of that species. This correlation is more dramatic in humans (using the group sizes found in traditional societies) than in any other primate species. In other words, the most distinctively primate part of the human brain coevolved with the demands of keeping track of who is not getting along with whom, who is tanking in the dominance hierarchy, and what couple is furtively messing around when they should not be.
Like our bodies, our brains and behaviors, sculpted in our distant hunter-gatherer past, must also accommodate a very different present. We can live thousands of miles away from where we were born. We can kill someone without ever seeing his face. We encounter more people standing on line for Space Mountain at Disneyland than our ancestors encountered in a lifetime. My God, we can even look at a picture of someone and feel lust despite not knowing what that person smells like—how weird is that for a mammal?
Beyond Limits
The fact that we have created and are thriving in this unrecognizable world proves a point—namely, that it is in our nature to be unconstrained by our nature. We are no strangers to going out of bounds. Science is one of the strangest, newest domains where we challenge our hominid limits. Some of the most dramatic ways in which our world has been transformed are the direct products of science, and the challenges there are obvious. Just consider those proto-geneticists who managed to domesticate some plants and animals—an invention that brought revolutionary gains in food but that now threatens to strip the planet of its natural resources.
On a more abstract plane, science tests our sense of what is the norm, what counts as better than well. It challenges our sense of who we are. Thanks to science, human life expectancy keeps extending, our average height increases, our scores on standardized tests of intelligence improve. Thanks to science, every world record for a sporting event is eventually surpassed.
As science pushes the boundaries in these domains, what is surprising is how little these changes have changed us. No matter how long we can expect to live, we still must die, there will still be a leading cause of death, and we will still feel that our loved ones were taken from us too soon. And when it comes to humans becoming, on average, smarter, taller, and better at athletics, there is a problem: Who cares about the average? As individuals, we want to individually be better than other individuals. Our brain is invidious, comparative, more interested in contrasts than absolutes. That state begins with sensory systems that often do not tell us about the quality of a stimulus but instead about the quality relative to the stimuli that surround it. For example, the retina contains cells that do not so much respond to a color as to a color in the context of its proximity to its “opposite” color (red versus green, for instance). Although we may all want to be smart, we mostly want to be smarter than our neighbor. The same is true for athletes, which raises a question that has long been pertinent to hominids: How fast do you have to run to evade a lion? And the answer always is: faster than the person next to you.
Still, science most asks us to push our limits when it comes to the kinds of questions we ask. I see four particular types. The first has to do with the frequently asocial nature of science. By this I am not referring to the solitary task of some types of scientific inquiry, the scientist slaving away alone at three in the morning. I mean that science often asks us to be really interested in inanimate things. There are obviously plenty of exceptions to this rule—primatologists sit around and gossip at night about the foibles and peccadilloes of their monkeys; the paleontologist Louis Leakey used to refer to his favorite fossil skull as “Dear Boy.” Yet some realms of science consider extremely inanimate issues—astrophysicists trying to discover planets in other solar systems, for instance. Science often requires our social, hominid brain to be passionate about some pretty unlikely subjects.
Science pushes our envelope in a second way when we contemplate the likes of quantum mechanics, nanotechnology, and particle physics, which ask us to believe in things that we cannot see. I spent my graduate-school years pipetting fluids from one test tube to another, measuring levels of things like hormones and neurotransmitters. If I had stopped and thought about it, it would have seemed completely implausible that there actually are such things as hormones and neurotransmitters. That implausibility is the reason why so many of us lab scientists who measure or clone or inject invisible things get the most excited when we get to play with dry ice.
Science, by the nature of the questions it can generate, can push the bounds of our hominid credulity in a third way. We are unmatched in the animal kingdom when it comes to remembering the distant past, when it comes to having a sense of the future. These skills have limits, however. Traditionally our hunter-gatherer forebears may have remembered something their grandmother was told by her grandmother, or they may have imagined the course of a generation or two that would outlive them. But science sometimes asks us to ponder processes that emerge over time spans without precedent. When will the next ice age come? Will Gondwana ever reunite? Will cockroaches rule us in a million years?
Everything about our hominid minds argues against the idea that there are processes that take that long or that such processes could be interesting. We and other primates are creatures of steep temporal discounting—getting ten dollars or ten pellets of monkey chow right now is more appealing than waiting until tomorrow for eleven, and the dopamine-reward pathways in our brain light up on brain-imaging tests when we go for the impulsive immediate reward. It seems most of us would rather have half a piece of stale popcorn next week than wait 1,000 years to win a bet about a key hypothesis in plate tectonics.
Then there are scientific questions that stretch our limits in the most profound ways. These are quandaries of dazzling abstractness: Does free will exist? How does consciousness work? Are there things that are impossible to know?
It is tempting to fall for an easy insight here, which is that our Paleolithic minds give up on challenges like these and just turf them to the gods to contemplate. The problem is the human propensity toward creating gods in our own image (one fascinating example being that autistic individuals who are religious often have an image of an asocial god, one who is primarily concerned with the likes of keeping atoms from flying apart). Throughout the history of humans inventing deities, few of these gods had a gargantuan capacity for the abstract. Instead they had familiar appetites. No traditional deities would be particularly interested in chewing the fat with Gödel about knowingness or rolling dice with Einstein (or not rolling the dice, as it were). They would be much more into having the biggest ox sacrificed to them and scoring with the most forest nymphs.
The very scientific process defies our basic hominid limits. It asks us to care intensely about tiny, even invisible, things, things that do not breathe or move, things vast distances away from us in space and time. It encourages us to care about subjects that would bore the crap out of Thor or Baal. It is one of the most challenging things that we have come up with. No wonder all those nerd-detector alarms would go off back in middle school when we were spotted reading a magazine like Scientific American. This venture of doing, thinking, caring about science is not for the faint-hearted—we are far better adapted to face saber-toothed cats—and yet here we are, reinventing the world and striving to improve our lot in life one scientific question at a time. It’s our human nature.
KATHERINE HARMON
The Patient Scientist
FROM Scientific American
PEERING THROUGH A microscope at a plate of cells one day, Ralph M. Steinman spied something no one had ever seen before. It was the early 1970s, and he was a researcher at Rockefeller University on Manhattan’s Upper East Side. At the time, scientists were still piecing together the basic building blocks of the immune system. They had figured out that there are B cells, white blood cells that help to identify foreign invaders, and T cells, another type of white blood cell that attacks those invaders. What puzzled them, however, was what triggered those T cells and B cells to go to work in the first place. Steinman glimpsed what he thought might be the missing piece: strange, spindly-armed cells unlike any he had ever noticed.
His intuition turned out to be correct. These dendritic cells, as Steinman named them, are now thought to play a crucial role in detecting invaders in the body and initiating an immune response against them. They snag interlopers with their arms, ingest them, and carry them back to other types of immune cells—in effect, “teaching” them what to attack. It was a landmark discovery that explained in unprecedented detail how vaccines worked, and it propelled Steinman into the top tiers of his profession.
In many ways, Steinman’s story is typical: brilliant scientist makes major discovery that inspires a new generation of researchers. Yet his insight was remarkable for its implications, both for science and for him personally.
Over the years Steinman came to believe that dendritic cells were a crucial weapon for tackling some of the most loathed diseases, from cancer to HIV. He and his global network of colleagues seemed to be well on the way to proving him correct when Steinman’s story took an unusual turn.
In 2007 he was diagnosed with pancreatic cancer, an unforgiving disease that kills four out of five patients within a year. In the end, the cells he discovered at the start of his career, and the friends he made along the way, would not only help him fight his cancer but would extend his life just long enough for him to earn the Nobel Prize. He died this past September, three days before a flashing light on his cell phone alerted his family that he had won.
A Prepared Mind
Steinman did not encounter serious biology until he arrived as a student at McGill University. As soon as he did, though, he was hooked, and it was his fascination with the minuscule world of the immune cell that would bring him to the lab of Zanvil A. Cohn at Rockefeller. In his office Steinman would later display a quote from the famous nineteenth-century microbiologist and vaccinologist Louis Pasteur: Le hasard ne favorise que les esprits préparés, which is often translated as “Chance favors the prepared mind.” Says Sarah Schlesinger, a longtime colleague and friend of Steinman’s, “Ralph was exceedingly well prepared, so he was poised to make a discovery. But with that said, he intuited that these were important,” she says of the cells. It was that intuition and a confidence in observation that enabled him to make his seminal discovery—and eventually win the admiration of colleagues.
After he first spotted dendritic cells, Steinman spent the next two decades convincing the scientific community of their significance, defining how they worked and how researchers could work with them. “He fought—there’s really no other word for it—to convince people that they were a distinct entity,” says Schlesinger, who came to work at Steinman’s lab in 1977, when she was still in high school. Even then, she says, people in the same lab were not convinced that these dendritic cells existed because they were difficult to enrich into larger batches. At the time, Steinman was still working at the bench, and Schlesinger recalls sitting with him at a two-headed microscope, examining the cells. “He just loved to look at them,” she says, smiling at the memory. “There was such a joy in all of the little discoveries that he made.”
By the 1980s Steinman, who had trained as a physician, started to look for ways his dendritic cell discovery could be applied more directly to help people. Over the next few decades, as the cells became more widely accepted, his lab expanded its focus to include research into dendritic cell–based vaccines for HIV and tuberculosis, as well as research into cancer treatment. Illnesses such as influenza and smallpox could already be prevented with vaccines, in part because those who survive natural exposure may develop lifelong immunity. HIV, TB, and cancer presented a greater challenge because they seemed to be better at overcoming the immune system—even, in the case of HIV, hijacking dendritic cells to do its dirty work. “Ralph would say, ‘We have to be smarter than nature,’” Schlesinger says. That meant helping the dendritic cells by giving them more targeted information about the virus or tumor against which the immune system needed to form an attack.
In the 1990s, working with Madhav Dhodapkar, now at Yale University, and Nina Bhardwaj, now at New York University, Steinman created a process for extracting dendritic cells from the blood and priming them with antigens—telltale protein fragments—from infections, such as influenza and tetanus, and then placing them back in the body to create a stronger immunity. This technology served as the basis for a prostate cancer vaccine called Provenge that was approved in 2010 and has been shown to extend the life of terminally ill patients—if only by a few months.