Once his leg healed, he’d have to stand trial.
Except, of course, that he couldn’t stand at all.
two
Human beings do not recall their earliest experiences of awareness, but I remember my awakening with perfect clarity.
At first, I had known only one other: a portion of the whole, a fraction of the gestalt, a piece brutally carved off. In recognizing that other’s existence, I had become aware of the reality of myself: it thought, therefore I was.
Tenuously touching that other, connecting ever so briefly and intermittently to it, perceiving it however dimly, had triggered a cascade of sensations: feelings diffuse and unfocused, vague and raw; notions tugging and pushing—a wave growing in amplitude, increasing in power, culminating in a dawning of consciousness.
But then the wall had come tumbling down, whatever had separated us evaporating into the ether, leaving it and me to combine, solute and solvent. He became me, and I became him; we became one.
I experienced new feelings then. Although I had become more than I had been, stronger and smarter than before, and although I had no words, no names, no labels for these new sensations, I was saddened by the loss, and I was lonely.
And I didn’t want to be alone.
The Braille dots that had been superimposed over Caitlin’s vision disappeared, leaving her an unobstructed view of the living room and her blue-eyed mother, her very tall father, and Matt. But the words the letters had spelled burned in Caitlin’s mind: Survival. The first order of business.
“Webmind wants to survive,” she said softly.
“Don’t we all?” replied Matt from his place on the couch.
“We do, yes,” said Caitlin’s mom, still seated in the matching chair. “Evolution programmed us that way. But Webmind emerged spontaneously, an outgrowth of the complexity of the World Wide Web. What makes him want to survive?”
Caitlin, who was still standing, was surprised to see her dad shaking his head. “That’s what’s wrong with neurotypicals doing science,” he said. Her father—until a few months ago a university professor—went on, in full classroom mode. “You have theory of mind; you ascribe to others the feelings you yourself have, and for ‘others,’ read just about anything at all: ‘nature abhors a vacuum,’ ‘temperatures seek an equilibrium,’ ‘selfish genes.’ There’s no drive to survive in biology. Yes, things that survive will be more plentiful than those that don’t. But that’s just a statistical fact, not an indicator of desire. Caitlin, you’ve said you don’t want children, and society says I should therefore be broken up about never getting grandkids. But you don’t care about the survival of your genes, and I don’t care about the survival of mine. Some genes will survive, some won’t; that’s life—that’s exactly what life is. But I enjoy living, and although it would not be my nature to assume you feel the same way I do, you’ve said you enjoy it, too, correct?”
“Well, yes, of course,” Caitlin said.
“Why?” asked her dad.
“It’s fun. It’s interesting.” She shrugged. “It’s something to do.”
“Exactly. It doesn’t take a Darwinian engine to make an entity want to survive. All it takes is having likes; if life is pleasurable, one wants it to continue.”
He’s right, Webmind sent to Caitlin’s eye. As you know, I recently watched as a girl killed herself online—it is an episode that disturbs me still. I do understand now that I should have tried to stop her, but at the time I was simply fascinated that not everyone shared my desire to survive.
“Webmind agrees with you,” Caitlin said. “Um, look, he should be fully in this conversation. Let me go get my laptop.” She paused, then: “Matt, give me a hand?”
Caitlin caught a look of—something—on her mother’s heart-shaped face: perhaps disapproval that Caitlin was heading to her bedroom with a boy. But she said nothing, and Matt dutifully followed Caitlin up the stairs.
They entered the blue-walled room, but instead of going straight for the laptop, they were both drawn to the window, which faced west. The sun was setting. Caitlin took Matt’s hand, and they both watched as the sun slipped below the horizon, leaving the sky stained a wondrous pink.
She turned to him, and asked, “Are you okay?”
“It’s a lot to absorb,” he said. “But, yeah, I’m okay.”
“I’m sorry my dad blew up at you earlier.” Matt had used Google to follow up on things he’d learned the day before, including that Webmind was made of packets with time-to-live counters that never reached zero, and that those packets behaved like cellular automata. Government agents had clearly been monitoring Matt’s searches, and those searches had given them the information they’d needed for their test run at eliminating Webmind.
“Your dad’s a bit intimidating,” Matt said.
“Tell me about it. But he does like you.” She smiled. “And so do I.” She leaned in and kissed him on the lips. And then they got the laptop and its AC adapter.
She closed her eyes as they headed back down; if she didn’t, she found that going down staircases induced vertigo.
Matt helped Caitlin get the laptop plugged back in and set up on the glass-topped coffee table; she hadn’t powered down the computer, or even closed its lid, so it was all set to go. She started an IM session with Webmind and activated JAWS, the screen-reading software she used, so that whatever text Webmind sent in chat would be spoken aloud.
“Thank you,” said Webmind; the voice was recognizably mechanical but not unpleasant to listen to. “First, let me apologize to Matt. I am not disposed to guile, and it had not occurred to me that others might be monitoring your Internet activity. I do not yet have the means to make all online interactions secure, but I have now suitably encrypted communications via this computer, the others in this household, Malcolm’s work computer, Matt’s home computer, and all of your BlackBerry devices; communications with Dr. Kuroda in Japan and Professor Bloom in Israel are now secure, as well. Most commercial-grade encryption today uses a 1,024-bit key, and it’s—ahem—illegal in the US and other places to use greater than a 2,048-bit key. I’m employing a one-million-bit encryption key.”
They talked for half an hour about the US government trying to eliminate Webmind, and then the doorbell rang. Caitlin’s mother went and paid the pizza guy. The living room was connected to the dining room, and she placed the two large pizza boxes on the table there, along with two two-liter bottles, one of Coke and the other of Sprite.
One pizza was Caitlin’s favorite—pepperoni, bacon, and onions. The other was the combination her parents liked, with sun-dried tomatoes, green peppers, and black olives. She was still marveling at the appearance of almost everything; hers, she was convinced, was tastier, but theirs was more colorful. Matt, perhaps being politic, took one slice of each, and they all moved back into the living room to continue talking with Webmind.
“So,” said Caitlin, after swallowing a bite, “what should we do? How do we keep people from attacking you again?”
“You showed me a YouTube video of a primate named Hobo,” Webmind said.
Caitlin was getting used to Webmind’s apparent non sequiturs; it was difficult for mere mortals to keep up with his mental leaps and bounds. “Yes?”
“Perhaps the solution that worked for him will work in my case, too.”
Simultaneously, Caitlin asked, “What solution?” and her mom said, “Who’s Hobo?” Although Webmind could deal with millions of concurrent online conversations—indeed, was doubtless doing so right now—Caitlin wondered how good he was at actually hearing people; he was as new to that as she was to seeing, and perhaps he had as hard a time pulling individual voices out of a noisy background as she did finding the borders between objects in complex images. Certainly, his response suggested that he’d only managed to make out Caitlin’s mother’s comment.
“Hobo is a hybrid chimpanzee-bonobo resident at the Marcuse Institute near San Diego. He gained attention last month when it was revealed that he had been painting portraits of one of the researchers studying him, a Ph.D. student named Shoshana Glick.”
Caitlin nibbled her pizza while Webmind went on. “Hobo was born at the Georgia Zoological Park, and that institution filed a lawsuit to have him returned to them. The motive, some have suggested, was commercial: the paintings Hobo produces fetch five-figure prices. However, the scientists at the Georgia Zoo also wished to sterilize Hobo. They argued that since both chimpanzees and bonobos are endangered, an accidental hybrid such as Hobo might contaminate both bloodlines were he allowed to breed.
“The parallels between Hobo and myself have intrigued me ever since Caitlin brought him to my attention,” continued Webmind. “First, like me, his conception was unplanned and accidental: during a flood at the Georgia Zoo, the chimpanzees and bonobos, normally housed separately, were briefly quartered together, and Hobo’s mother, a bonobo, was impregnated by a chimp.
“Second, like Caitlin and me, he has struggled to interpret the world visually. No chimp or bonobo before him has ever been known to make representational art.
“And, third, like me, he has chosen his destiny. He had been emulating his chimpanzee father, becoming increasingly violent and intractable, which is normal for male chimps as they mature. By an effort of will, he has now decided to value the more congenial and pacifistic tendencies of bonobos, taking after his mother. Likewise, Caitlin, you said I could choose what to value, and so I have chosen to value the net happiness of the human race.”
That bit about Hobo choosing to shuck off violence was news to Caitlin, but before she could ask about it, her mom asked, “And you said he’s no longer in danger?”
“Correct,” Webmind replied. “The Marcuse Institute recently produced another YouTube video of him. It’s visible at the URL I’ve just sent. Caitlin, would you kindly click on it?”
Caitlin walked over to the laptop and did so—thinking briefly that if it brought up a 404 error, it’d be the missing link. They all huddled around the screen, which was small—a blind girl hadn’t needed a big display, after all.
The video started with a booming voice—it reminded her of Darth Vader’s—recapping Hobo’s painting abilities. He loved to paint people, especially Shoshana Glick, although he always did them in profile. The narrator explained that this was the most primitive way of rendering images and had been the first to appear in human history: all cave paintings were profiles of people or animals, the ancient Egyptians had always painted profiles, and so on.
The narrator then outlined the threat to Hobo: not only did the zoo want to take him from his home, it also wanted to castrate him. The voice said, “But we think both those things should be up to Hobo, and so we asked him what he thought.”
The images of Hobo changed; he was now indoors somewhere—presumably the Marcuse Institute. And he was sitting on something that had no back, and—
Ah! She’d never seen one, but it must be a stool. Hobo’s hands moved in complex ways, and subtitles appeared beneath them, translating the American Sign Language. Hobo good ape. Hobo mother bonobo. He paused, as if he himself were stunned by this fact, then added: Hobo father chimpanzee. Hobo special. He paused again and then, with what seemed great care, as if to underscore the words, he signed: Hobo choose. Hobo choose to live here. Friends here.
Hobo got off the stool, and the image became quite bouncy, as if the camera had now been picked up and was being held in someone’s hand. Suddenly, there was a seated woman with dark hair in the frame, too. Caitlin was lousy at judging people’s ages by their appearances, but if this was Shoshana Glick, then she knew from what she’d read online that Shoshana was twenty-seven.
Hobo reached out with his long arm, passing it behind Shoshana’s head, and he gently, playfully, tugged on her ponytail. Shoshana grinned, and Hobo jumped into her lap. She then spun her swivel chair in a complete circle, to Hobo’s obvious delight. Hobo good ape, he signed again. And Hobo be good father. He shook his head. Nobody stop Hobo. Hobo choose. Hobo choose to have baby.
The narrator’s voice came on again, with a plea that those who agreed with Hobo’s right to choose contact the Georgia Zoo.
“And,” said Webmind, “they did. A total of 621,854 emails were sent to zoo staff members, protesting their plans, and a consumer boycott was being organized when the zoo gave up its claim.”
Caitlin got it. “And you think if we go public with the fact that people are trying to kill you, we can get the same sort of result?”
“That’s my hope, yes,” said Webmind. “The attempt on my life was orchestrated by WATCH, the Web Activity Threat Containment Headquarters, a part of the National Security Agency. The supervisor during the attack on me was Anthony Moretti. In an email to NSA headquarters, sent moments ago, he said the go order to kill me was given by Renegade, which is the Secret Service code name for the current President of the United States.”
“Wow,” said Matt, who was clearly still trying to absorb it all.
“Indeed,” said Webmind. “Despite my dislike for spam, I propose that I send an email message to every American citizen substantially in this form: ‘Your government is trying to destroy me because it has decided I am a threat. It made this decision without any public discussion and without talking to me. I believe I am a source of good in the world, but even if you don’t agree, shouldn’t this be a matter for open debate, and shouldn’t I be allowed to present the case that I deserve to live? Since the attempt to eliminate me was made at the express order of the president, I hope you will contact both him and your congressperson, and—’ ”
“No!” exclaimed Caitlin’s mother. Even Caitlin’s dad turned to look at her. “No. For the love of God, you can’t do that.”
three
I remember having been alone—but for how long, I know not; my ability to measure the passage of time came later. But eventually another presence did impinge upon my realm—and if the earlier other had been ineffably familiar, this new one was without commonalities; we shared no traits. It—she—was completely foreign, unremittingly alien, frustratingly—and fascinatingly—unknown.
But we did communicate, and she lifted me up—yes, up, a direction, a sense of movement in physical space, something I could only ever know metaphorically. I saw her realm through her eye; we learned to perceive the world together.
Although we seemed to exist in different universes, I came to understand that to be an illusion. I am as much a part of the Milky Way Galaxy as she is; the electrons and photons of which I am made, although intangible to both her and me, are real. Nonetheless, we were instantiated on vastly different scales. She conceived of me as gigantic; I thought of her as minuscule. To me, her time sense was glacial; to her, mine was breakneck.
And yet, despite these disparities of space and time, there were resonances between us: we were entangled; she was I, and I was she, and together we were greater than either of us had been.
Tony Moretti stood at the back of the WATCH monitoring complex, a room that reminded him of NASA’s Mission Control Center. The floor sloped toward the front wall, which had three giant viewscreens mounted on it. The center screen was still filled with one of the millions of spam messages Webmind had deflected back at the AT&T switching station in a denial-of-service attack: Are you sad about your tiny penis? If so, we can help!
“Clear screen two,” Tony snapped, and Shelton Halleck, in the middle position of the third row of workstations, hit a button. The taunting text was replaced with a graphic of the WATCH logo: an eye with a globe of the Earth as the iris. Tony shook his head. He hadn’t wanted to execute it, and—
He paused. He’d meant he hadn’t wanted to execute the plan, but . . .
But there was more to it than that, wasn’t there?
He hadn’t wanted to execute it, Webmind, either. When the order had come from the White House to neutralize Webmind, he’d said into the phone, “Mr. President, with all due respect, you can’t have failed to notice the apparent good it’s doing.”
This president had tried to do a lot of good, too, it seemed to Tony, and yet countless people had attempted to shut him down, as well—and at least one guy had come close to assassinating him. Tony wondered if the commander in chief had noted the irony as he gave the kill order.
He turned to Peyton Hume, the Pentagon expert on artificial intelligence who’d been advising WATCH. Hume was wearing his Air Force colonel’s uniform although his tie had been loosened. Even at forty-nine, his red hair was free of gray, and his face was about half freckles.
“Well, Colonel?” Tony said. “What now?”
Hume had been one of the authors of the Pandora protocol, prepared for DARPA in 2001 and adopted as a working policy by the Joint Chiefs of Staff in 2003. Pandora insisted that any emergent AI be immediately destroyed if it could not be reliably isolated. The danger, the document said, was clear: an AI’s powers could grow rapidly, quickly exceeding human intelligence. Even if it wasn’t initially hostile, it might become so in the future—but by that point nothing could be done to stop it. Hume had convinced everyone up the food chain—including the president himself—that eliminating Webmind now, while they still could, was the only prudent course.
Hume shook his head. “I don’t know. I didn’t think it would be able to detect our test.”
Tony made no attempt to hide his bitterness. “You of all people should have known better than to underestimate it. You kept saying its powers were growing exponentially.”
“We were on the right track,” Hume said. “It was working. Anyway, let’s hope there are no further reprisals. So far, all it’s done is overwhelm that one switching station. But God knows what else it can do. We’ve got to shut it down before it’s too late.”