We’ll get used to living inside a digital bubble. Unless, perhaps, we must think hard when we connect via a brain-computer interface to the house’s fixtures. But even then, habit being what it is, we’d most likely come home, absentmindedly recall the code opening the front door’s lock, step over the threshold, daydream a hand swiping a light switch until the room brightens, mind’s-eye visualize the solar-electric shingles melting ice jams on the roof, while simultaneously worrying over a supposed affront at work or rebuff at school, anticipating dinner, fantasizing about a cute guy or gal, and hearing a stupid tune lodged in some spiky thicket of the brain.

  Whether it’s hospital chairs robed in silver nanojackets to ward off bacteria, or invisibility cloaks, or degradable electronic devices that dissolve when you’re finished with them, or thin, flexible solar panels that can be printed or painted onto a surface, the writing is on the wall (though you’ll need a microscope to read it). And when it comes to the delicate balance of Earth’s life forms, it may be a small, small world after all.

  NATURE, PIXILATED

  It is winter in upstate New York, on a morning so cold the ground squeaks loudly underfoot as sharp-finned ice crystals rub together. The trees look like gloved hands, fingers frozen open. Something lurches from side to side up the trunk of an old sycamore—a nuthatch climbing in zigzags, on the prowl for hibernating insects. A crow veers overhead, then lands. As snow flurries begin, it leaps into the air, wings aslant, catching the flakes to drink. Or maybe just for fun, since crows can be mighty playful.

  Another life form curves into sight down the street: a girl laughing down at her gloveless fingers, which are busily texting on some handheld device. This sight is so common that it no longer surprises me, though strolling in a large park one day I was startled by how many people were walking without looking up, or walking in a myopic daze while talking on their “cells,” as we say in shorthand, as if spoken words were paddling through the body from one saltwater lagoon to another.

  We don’t find it strange that, in the Human Age, slimy, hairy, oozing, thorny, smelly, seed-crackling, pollen-strewn nature is digital. It’s finger-swiped across, shared with others over, and honeycombed in our devices. For the first time in human history, we’re mainly experiencing nature through intermediary technology that, paradoxically, provides more detail while also flattening the sensory experience. Because we have riotously visual, novelty-loving brains, we’re entranced by electronic media’s caged hallucinations. Over time, can that affect the hemispheric balance of the brain and dramatically change us? Are we able to influence our evolution through the objects we dream up and rely on?

  We may possess the same brain our prehistoric ancestors did, but we’re deploying it in different ways, rewiring it to meet twenty-first-century demands. The Neanderthals didn’t have the same mental real estate that modern humans enjoy, gained from a host of skills and preoccupations—wielding laser scalpels, joyriding in cars, navigating the digital seas of computers, iPhones, and iPads. Generation by generation, our brains have been evolving new networks, new ways of wiring and firing, favoring some behaviors and discarding others, as we train ourselves to meet the challenges of a world we keep amplifying, editing, deconstructing, and recreating.

  Through lack of practice, our brains have gradually lost their mental maps for how to read hoofprints, choose the perfect flints for arrows, capture and transport fire, tell time by plant and animal clocks, navigate by landmarks and the stars. Our ancestors had a better gift for observing and paying attention than we do. They had to: their lives depended on it. Today, paying attention as if your life depends on it can be a bugbear requiring conscious effort. More and more people are doing all of their reading on screens, and studies find that they’re retaining 46 percent less information than when they read printed pages. It’s not clear why. Have all the distractions shortened our attention spans? Do the light displays interfere with memory? It’s not like watching animals in ordinary life. Onscreen, what we’re really seeing isn’t the animal at all, but just three hundred thousand tiny phosphorescent dots flickering. A lion on TV doesn’t exist until your brain concocts an image, piecemeal, from the pattern of scintillating dots.

  College students are testing about 40 percent lower in empathy than their counterparts of twenty or thirty years ago. Is that because social media has replaced face-to-face encounters? We are not the most socially connected we’ve ever been—that was when we lived in small tribes. In our cells and instincts, we still crave that sense of belonging, and fear being exiled, because for our ancestors, living alone in the wild without the group protection of the tribe meant almost certain death. Those with a strong social instinct survived to pass their genes along to the next generation. We still follow that instinct by flocking to social media, which connects us to a vast multicultural human tribe—even though it isn’t always personal.

  Many of our inventions have reinvented us, both physically and mentally. Through texting, a child’s brain map of the thumbs grows larger. Our teeth were sharper and stronger before we invented cooking; now, they’re blunt and fragile. Even cheap and easily crafted inventions can be powerful catalysts. The novelty of simple leather stirrups advanced warfare, helped to topple empires, and introduced the custom of romantic “courtly” love to the British Isles in the eleventh century. Before stirrups, wielding either a bow and arrow or a javelin, a rider might easily tumble off his horse. Stirrups added lateral stability, and soldiers learned the art of charging with lances at rest, creating terror as their horses drove the lances home. Fighting in this specialized way, an aristocracy of well-armed and -armored warriors emerged, and feudalism arose as a way to finance these knights, whose code of chivalry and courtly love quickly dominated Western society. In 1066, William the Conqueror’s army was outnumbered at the Battle of Hastings, but, by using mounted shock warfare, he won England anyway, and introduced a feudal society steeped in stirrups and the romance of courtly love.

  Tinkering with plows and harnesses, beyond just alleviating the difficult work of breaking ground, meant farmers could plant a third-season crop of protein-rich beans, which fortified the brain, and some historians believe that this brain boost, right at the end of the Dark Ages, ushered in the Renaissance. Improved ship hulls spread exotic goods and ideas around the continents—as well as vermin and diseases. Electricity allowed us to homestead the night as if it were an invisible country. Remember, Thomas Edison perfected the lightbulb by candle or gas-lamp light.

  Our inventions don’t just change our minds; they modify our gray and white matter, rewiring the brain and priming it for a different mode of living, problem-solving, and adapting. In the process, a tapestry of new thoughts arises, and one’s worldview changes. Think how the nuclear bomb altered warfare, diplomacy, and our debates about morality. Think how television shoved wars and disasters into our living rooms, how cars and airplanes broadened everything from our leisure to our gene pool, how painting evolved when paints became portable, how the printing press remodeled the spread of ideas and the possibility of shared knowledge. Think how Eadweard Muybridge’s photographs of things in motion—horses running, humans broad-jumping—awakened our understanding of anatomy and everyday actions.

  Or think how the invention of the typewriter transformed the lives of women, great numbers of whom could leave the house with dignity to become secretaries. Although they won the opportunity because their dexterous little fingers were considered better able to push the keys, working in so-called pools they risked entertaining such bold ideas as their right to vote. Even the low-tech bicycle modified the lives of women. Straddling a bike was easier if they donned bloomers—large billowy pants that revealed little more than that they had legs—which scandalized society. They also had to remove their suffocating “strait-laced” corsets in order to ride. Since that seemed wicked, being “loose” became synonymous with low morals.

  In ancient days, our language areas grew because we found the rumpled currency of language lifesaving, not to mention heady, seductive, and fun. Language became our plumage and claws. The more talkative among us lived to pass on their genes to chatty offspring. Language may be essential, but the invention of reading and writing was pure luxury. The uphill march children find in learning how to read reminds us that it may be one of our best tools, but it’s not an instinct. I didn’t learn to read with fluent ease until I was in college. It takes countless hours of practice to fine-tune a brain for reading. Or anything else.

  Near- or farsightedness was always assumed to be hereditary. No more. In the United States, one-third of all adults are now myopic, and nearsightedness has been soaring in Europe as well. In Asia, the numbers are staggering. A recent study testing the eyesight of students in Shanghai and young men in Seoul reported that 95 percent were nearsighted. From Canberra to Ohio, one finds similar myopia, a generation of people who can’t see the forest for the trees. This malady, known as “urban eyes,” stems from spending too much time indoors, crouched over small screens. Our eyeballs adjust by changing shape, growing longer, which is bad news for those of us squinting to see far away. For normal eye growth, children need to play outside, maybe watching how a squirrel’s nest, high atop an old hickory tree, sways in the wind, then zooming down to the runnel-rib on an individual blade of grass. Is that brown curtsey at the bottom of the yard a wild turkey or a windblown chrysanthemum?

  In the past, bands of humans hunted and gathered, eyes nimble, keenly attuned to a nearby scuffle or a distant dust-mist, as they struggled to survive. Natural light, peripheral images, a long field of view, lots of vitamin D, an ever-present horizon, and a caravan of visual feedback shaped their eyes. They chipped flint and arrowheads, flayed and stitched hides, and did other close work, but not for the entire day. Close work now dominates our lives, but that’s very recent, one of the Anthropocene’s hallmarks, and we may evolve into a more myopic species.

  Studies also show that Google is affecting our memory in chilling ways. We more easily forget anything we know we can find online, and we tend to remember where online information is located, rather than the information itself.

  Long ago, the human tribe met to share food, expertise, ideas, and feelings. The keen-eyed observations they exchanged about the weather, landscape, and animals saved lives on a daily basis. Now there are so many of us that it’s not convenient to sit around a campfire. Electronic campfires are the next best thing. We’ve reimagined space, turning the Internet into a favorite pub, a common meeting place where we can exchange knowledge or know-how or even meet a future mate. The sharing of information is fast, unfiltered, and sloppy. Our nervous systems are living in a stream of such data, influenced not just by the environment—as was the case for millennia—but abstractly, virtually. How has this changed our notion of reality? To the brain, whatever it inhabits is real; plug it into a virtual world and that world becomes real. The body remains in physical space, while the brain travels in a virtual space that is both nowhere and everywhere at once.

  ONE MORNING SOME birder pals and I spend an hour at Sapsucker Woods Bird Sanctuary, watching two great blue herons feed their five rowdy chicks. It’s a perfect setting for nesting herons, with an oak-snag overhanging a plush green pond, marshy shallows to hunt in, and a living larder of small fish and frogs. Only a few weeks old, the chicks are mainly fluff and appetite.

  Mom and Dad run relays, and each time one returns the chicks clack wildly like wooden castanets and tussle with each other, beaks flying. Then one hogs Mom’s beak by scissoring across it and holding on until a fish slides loose. The other chicks pounce, peck like speed typists, try to steal the half-swallowed fish, and if it’s too late for that, grab Mom’s beak and claim the next fish. Sibling rivalry is rarely so explicit. We laugh and coo like a flock of doting grandparents.

  At last Mom flies off to hunt, and the chicks hush for a nap, a trial wing stretch, or a flutter of the throat pouch. Real feathers have just begun to cover their down. When a landing plane roars overhead, they tilt their beaks skyward, as if they were part of a cargo cult, expecting food from pterodactyls. We could watch their antics all day.

  I’m new to this circle of blue heron aficionados, some of whom have been visiting the nest daily since April and comparing notes. “I have let a lot of things go,” one says. “On purpose, though. This has been such a rare and wonderful opportunity.” “Work?” another replies. “Who has time to work?”

  So true. The bird sanctuary offers a rich mosaic of live and fallen trees, mallards, songbirds, red-tailed hawks, huge pileated woodpeckers, and of course yellow-bellied sapsuckers. Canada geese have been known to stop traffic (literally)—with adults serving as crosswalk guards. It’s a green mansion, and always captivating.

  However, we’re not really there. We’re all—more than 1.5 million of us thus far—watching on two live webcams affixed near the nest, and “chatting” in a swiftly scrolling Twitter-like conversation that rolls alongside the bird’s-eye view.

  We’re virtually at the pond, without the mud, sweat, and mosquitoes. No need to dress, share snacks, make conversation. Some of us may be taking a coffee break, or going digitally AWOL during class or work. All we can see is the heron nest up close, and that’s a wonderful treat we’d miss if we were visiting on foot. In a couple of weeks the camera will follow the chicks as they learn to fish.

  This is not an unusual way to pass time nowadays, and it’s swiftly becoming the preferred way to view nature. Just a click away, I could have chosen a tarantula-cam, meerkat-cam, blind-mole-rat-cam, or twenty-four-hour-a-day Chinese-panda-cam from a profusion of equally appealing sites, some visited by tens of millions of people. Darting around the world to view postage-stamp-size versions of wild animals that are oblivious to the video camera is the ultimate cinéma vérité, and an odd shrinking and flattening of the animals, all of whom seem smaller than you. Yet I rely on virtual nature to observe animals I may never see in the wild. When I do, abracadabra, a computer mouse becomes a magic wand and there is an orphan wombat being fed by wildlife rescuers in Australia. Or from 308 photos of cattle posted on Google Earth I learn that herds tend to face either north or south, regardless of weather conditions, probably because they’re able to perceive magnetic fields, which helps them navigate, however short the distance. Virtual nature offers views and insights that might otherwise escape us. It also helps to satisfy a longing so essential to our well-being that we feel compelled to tune in, and we find it hypnotic.

  What happens when that way of engaging the world becomes habitual? Nature now comes to us, not the other way round—on a small glowing screen. You can’t follow a beckoning trail, or track a noise off-camera. You don’t exercise as you meander, uncertain what delight or danger may greet you, while feeling dwarfed by forces older and larger than yourself. It’s a radically different way of being—with nature, but not in nature—and it’s bound to shape us.

  Films and TV documentaries like Microcosmos, Winged Migration, Planet Earth, March of the Penguins, and The Private Life of Plants inspire and fascinate millions while insinuating environmental concerns into the living room. It’s mainly in such programs that we see animals in their natural settings, but they’re dwarfed, flattened, interrupted by commercials, narrated over, greatly edited, and sometimes staged for added drama. Important sensory feedback is missing: the pungent mix of grass, dung, and blood; the drone of flies and cicadas; the dry rustling of wind through tall grass; the welling of sweat; the sandpapery sun.

  On YouTube I just glimpsed several icebergs rolling in Antarctica—though without the grandeur of size, sounds, colors, waves, and panorama. Oddest of all, the icebergs looked a bit grainy. Lucky enough to visit Antarctica years ago, I was startled to find the air so clear that glare functioned almost as another color. I could see longer distances. Some icebergs are pastel, depending on how much air is trapped inside. And icebergs produce eerie whalelike songs when they rub together. True, in many places it’s a crystal desert, but in others life abounds. An eye-sweep of busy seals, whales, penguins and other birds, plus ice floes and calving glaciers, reveals so much drama in the foreground and background that it’s like entering a pop-up storybook. Watching icebergs online, or even at an Imax theater, or in sumptuous nature films, can be stirring, educational, and thought-provoking, but the experience is wildly different.

  Last summer, I watched as a small screen in a department store window ran a video of surfing in California. That simple display mesmerized high-heeled, pin-striped, well-coiffed passersby who couldn’t take their eyes off the undulating ocean and curling waves that dwarfed the human riders. Just as our ancient ancestors drew animals on cave walls and carved animals from wood and bone, we decorate our homes with animal prints and motifs, give our children stuffed animals to clutch, cartoon animals to watch, animal stories to read. Our lives trumpet, stomp, and purr with animal tales, such as The Bat Poet, The Velveteen Rabbit, Aesop’s Fables, The Wind in the Willows, The Runaway Bunny, and Charlotte’s Web. I first read these wondrous books as a grown-up, when both the adult and the kid in me were completely spellbound. We call each other by “pet” names, wear animal-print clothes. We ogle plants and animals up close on screens of one sort or another. We may not worship or hunt the animals we see, but we still regard them as necessary physical and spiritual companions. It seems the more we exile ourselves from nature, the more we crave its miracle waters. Yet technological nature can’t completely satisfy that ancient yearning.

  What if, through novelty and convenience, digital nature replaces biological nature? Gradually, we may grow used to shallower and shallower experiences of nature. Studies show that we’ll suffer. Richard Louv writes of widespread “nature deficit disorder” among children who mainly play indoors—an oddity quite new in the history of humankind. He documents an upswell in attention disorders, obesity, depression, and lack of creativity. A San Diego fourth-grader once told him: “I like to play indoors because that’s where all the electrical outlets are.” Adults suffer equally. It’s telling that hospital patients with a view of trees heal faster than those gazing at city buildings and parking lots. In studies conducted by Peter H. Kahn and his colleagues at the University of Washington, office workers in windowless cubicles were given flat-screen views of nature. They enjoyed greater health, happiness, and efficiency than those without virtual windows. But they weren’t as happy, healthy, or creative as people given real windows with real views of nature.