Page 19 of Oceanic


  “No.” The damage to her uterus from the miscarriage could be repaired; we’d been discussing the possibility for almost five years. We’d both had comprehensive chelation therapy to remove any trace of U-238. We could have children in the usual way with a reasonable degree of safety, if that was what we wanted. “But if you’ve already decided, I want you to tell me now.”

  Francine looked wounded. “That’s unfair.”

  “What is? Implying that you might not have told me, the instant you decided?”

  “No. Implying that it’s all in my hands.”

  I said, “I’m not washing my hands of the decision. You know how I feel. But you know I’d back you all the way, if you said you wanted to carry a child.” I believed I would have. Maybe it was a form of doublethink, but I couldn’t treat the birth of one more ordinary child as some kind of atrocity, and refuse to be a part of it.

  “Fine. But what will you do if I don’t?” She examined my face calmly. I think she already knew, but she wanted me to spell it out.

  “We could always adopt,” I observed casually.

  “Yes, we could do that.” She smiled slightly; she knew that made me lose my ability to bluff, even faster than when she stared me down.

  I stopped pretending that there was any mystery left; she’d seen right through me from the start. I said, “I just don’t want to do this, then discover that it makes you feel that you’ve been cheated out of what you really wanted.”

  “It wouldn’t,” she insisted. “It wouldn’t rule out anything. We could still have a natural child as well.”

  “Not as easily.” This would not be like merely having workaholic parents, or an ordinary brother or sister to compete with for attention.

  “You only want to do this if I can promise you that it’s the only child we’d ever have?” Francine shook her head. “I’m not going to promise that. I don’t intend having the operation any time soon, but I’m not going to swear that I won’t change my mind. Nor am I going to swear that if we do this it will make no difference to what happens later. It will be a factor. How could it not be? But it won’t be enough to rule anything in or out.”

  I looked away, across the rows of tables, at all the students wrapped up in their own concerns. She was right; I was being unreasonable. I’d wanted this to be a choice with no possible downside, a way of making the best of our situation, but no one could guarantee that. It would be a gamble, like everything else.

  I turned back to Francine.

  “All right; I’ll stop trying to pin you down. What I want to do right now is go ahead and build the Qusp. And when it’s finished, if we’re certain we can trust it … I want us to raise a child with it. I want us to raise an AI.”

  2029

  I met Francine at the airport, and we drove across São Paulo through curtains of wild, lashing rain. I was amazed that her plane hadn’t been diverted; a tropical storm had just hit the coast, halfway between us and Rio.

  “So much for giving you a tour of the city,” I lamented. Through the windscreen, our actual surroundings were all but invisible; the bright overlay we both perceived, surreally colored and detailed, made the experience rather like perusing a 3D map while trapped in a car wash.

  Francine was pensive, or tired from the flight. I found it hard to think of San Francisco as remote when the time difference was so small, and even when I’d made the journey north to visit her, it had been nothing compared to all the ocean-spanning marathons I’d sat through in the past.

  We both had an early night. The next morning, Francine accompanied me to my cluttered workroom in the basement of the university’s engineering department. I’d been chasing grants and collaborators around the world, like a child on a treasure hunt, slowly piecing together a device that few of my colleagues believed was worth creating for its own sake. Fortunately, I’d managed to find pretexts – or even genuine spin-offs – for almost every stage of the work. Quantum computing, per se, had become bogged down in recent years, stymied by both a shortage of practical algorithms and a limit to the complexity of superpositions that could be sustained. The Qusp had nudged the technological envelope in some promising directions, without making any truly exorbitant demands; the states it juggled were relatively simple, and they only needed to be kept isolated for milliseconds at a time.

  I introduced Carlos, Maria and Jun, but then they made themselves scarce as I showed Francine around. We still had a demonstration of the “balanced decoupling” principle set up on a bench, for the tour by one of our corporate donors the week before. What caused an imperfectly shielded quantum computer to decohere was the fact that each possible state of the device affected its environment slightly differently. The shielding itself could always be improved, but Carlos’s group had perfected a way to buy a little more protection by sheer deviousness. In the demonstration rig, the flow of energy through the device remained absolutely constant whatever state it was in, because any drop in power consumption by the main set of quantum gates was compensated for by a rise in a set of balancing gates, and vice versa. This gave the environment one less clue by which to discern internal differences in the processor, and to tear any superposition apart into mutually disconnected branches.

  Francine knew all the theory backward, but she’d never seen this hardware in action. When I invited her to twiddle the controls, she took to the rig like a child with a game console.

  “You really should have joined the team,” I said.

  “Maybe I did,” she countered. “In another branch.”

  She’d moved from UNSW to Berkeley two years before, not long after I’d moved from Delft to São Paulo; it was the closest suitable position she could find. At the time, I’d resented the fact that she’d refused to compromise and work remotely; with only five hours’ difference, teaching at Berkeley from São Paulo would not have been impossible. In the end, though, I’d accepted the fact that she’d wanted to keep on testing me, testing both of us. If we weren’t strong enough to stay together through the trials of a prolonged physical separation – or if I was not sufficiently committed to the project to endure whatever sacrifices it entailed – she did not want us proceeding to the next stage.

  I led her to the corner bench, where a nondescript gray box half a meter across sat, apparently inert. I gestured to it, and our retinal overlays transformed its appearance, “revealing” a maze with a transparent lid embedded in the top of the device. In one chamber of the maze, a slightly cartoonish mouse sat motionless. Not quite dead, not quite sleeping.

  “This is the famous Zelda?” Francine asked.

  “Yes.” Zelda was a neural network, a stripped-down, stylized mouse brain. There were newer, fancier versions available, much closer to the real thing, but the ten-year-old, public domain Zelda had been good enough for our purposes.

  Three other chambers held cheese. “Right now, she has no experience of the maze,” I explained. “So let’s start her up and watch her explore.” I gestured, and Zelda began scampering around, trying out different passages, deftly reversing each time she hit a cul-de-sac. “Her brain is running on a Qusp, but the maze is implemented on an ordinary classical computer, so in terms of coherence issues, it’s really no different from a physical maze.”

  “Which means that each time she takes in information, she gets entangled with the outside world,” Francine suggested.

  “Absolutely. But she always holds off doing that until the Qusp has completed its current computational step, and every qubit contains a definite zero or a definite one. She’s never in two minds when she lets the world in, so the entanglement process doesn’t split her into separate branches.”

  Francine continued to watch, in silence. Zelda finally found one of the chambers containing a reward; when she’d eaten it, a hand scooped her up and returned her to her starting point, then replaced the cheese.

  “Here are ten thousand previous trials, superimposed.” I replayed the data. It looked as if a single mouse was running through the maze, moving just as we’d seen her move when I’d begun the latest experiment. Restored each time to exactly the same starting condition, and confronted with exactly the same environment, Zelda – like any computer program with no truly random influences – had simply repeated herself. All ten thousand trials had yielded identical results.

  To a casual observer, unaware of the context, this would have been a singularly unimpressive performance. Faced with exactly one situation, Zelda the virtual mouse did exactly one thing. So what? If you’d been able to wind back a flesh-and-blood mouse’s memory with the same degree of precision, wouldn’t it have repeated itself too?

  Francine said, “Can you cut off the shielding? And the balanced decoupling?”

  “Yep.” I obliged her, and initiated a new trial.

  Zelda took a different path this time, exploring the maze by a different route. Though the initial condition of the neural net was identical, the switching processes taking place within the Qusp were now opened up to the environment constantly, and superpositions of several different eigenstates – states in which the Qusp’s qubits possessed definite binary values, which in turn led to Zelda making definite choices – were becoming entangled with the outside world. According to the Copenhagen interpretation of quantum mechanics, this interaction was randomly “collapsing” the superpositions into single eigenstates; Zelda was still doing just one thing at a time, but her behavior had ceased to be deterministic. According to the MWI, the interaction was transforming the environment – Francine and me included – into a superposition with components that were coupled to each eigenstate; Zelda was actually running the maze in many different ways simultaneously, and other versions of us were seeing her take all those other routes.

  Which scenario was correct?

  I said, “I’ll reconfigure everything now, to wrap the whole setup in a Delft cage.” A “Delft cage” was jargon for the situation I’d first read about seventeen years before: instead of opening up the Qusp to the environment, I’d connect it to a second quantum computer, and let that play the role of the outside world.

  We could no longer watch Zelda moving about in real time, but after the trial was completed, it was possible to test the combined system of both computers against the hypothesis that it was in a pure quantum state in which Zelda had run the maze along hundreds of different routes, all at once. I displayed a representation of the conjectured state, built up by superimposing all the paths she’d taken in ten thousand unshielded trials.

  The test result flashed up: CONSISTENT.

  “One measurement proves nothing,” Francine pointed out.

  “No.” I repeated the trial. Again, the hypothesis was not refuted. If Zelda had actually run the maze along just one path, the probability of the computers’ joint state passing this imperfect test was about one percent. For passing it twice, the odds were about one in ten thousand.

  I repeated it a third time, then a fourth.

  Francine said, “That’s enough.” She actually looked queasy. The image of the hundreds of blurred mouse trails on the display was not a literal photograph of anything, but if the old Delft experiment had been enough to give me a visceral sense of the reality of the multiverse, perhaps this demonstration had finally done the same for her.

  “Can I show you one more thing?” I asked.

  “Keep the Delft cage, but restore the Qusp’s shielding?”

  “Right.”

  I did it. The Qusp was now fully protected once more whenever it was not in an eigenstate, but this time, it was the second quantum computer, not the outside world, to which it was intermittently exposed. If Zelda split into multiple branches again, then she’d only take that fake environment with her, and we’d still have our hands on all the evidence.

  Tested against the hypothesis that no split had occurred, the verdict was: CONSISTENT. CONSISTENT. CONSISTENT.

  #

  We went out to dinner with the whole of the team, but Francine pleaded a headache and left early. She insisted that I stay and finish the meal, and I didn’t argue; she was not the kind of person who expected you to assume that she was being politely selfless, while secretly hoping to be contradicted.

  After Francine had left, Maria turned to me. “So you two are really going ahead with the Frankenchild?” She’d been teasing me about this for as long as I’d known her, but apparently she hadn’t been game to raise the subject in Francine’s presence.

  “We still have to talk about it.” I felt uncomfortable myself, now, discussing the topic the moment Francine was absent. Confessing my ambition when I applied to join the team was one thing; it would have been dishonest to keep my collaborators in the dark about my ultimate intentions. Now that the enabling technology was more or less completed, though, the issue seemed far more personal.

  Carlos said breezily, “Why not? There are so many others now. Sophie. Linus. Theo. Probably a hundred we don’t even know about. It’s not as if Ben’s child won’t have playmates.” Adai – Autonomously Developing Artificial Intelligences – had been appearing in a blaze of controversy every few months for the last four years. A Swiss researcher, Isabelle Schib, had taken the old models of morphogenesis that had led to software like Zelda, refined the technique by several orders of magnitude, and applied it to human genetic data. Wedded to sophisticated prosthetic bodies, Isabelle’s creations inhabited the physical world and learned from their experience, just like any other child.

  Jun shook his head reprovingly. “I wouldn’t raise a child with no legal rights. What happens when you die? For all you know, it could end up as someone’s property.”

  I’d been over this with Francine. “I can’t believe that in ten or twenty years’ time there won’t be citizenship laws, somewhere in the world.”

  Jun snorted. “Twenty years! How long did it take the U.S. to emancipate their slaves?”

  Carlos interjected, “Who’s going to create an adai just to use it as a slave? If you want something biddable, write ordinary software. If you need consciousness, humans are cheaper.”

  Maria said, “It won’t come down to economics. It’s the nature of the things that will determine how they’re treated.”

  “You mean the xenophobia they’ll face?” I suggested.

  Maria shrugged. “You make it sound like racism, but we aren’t talking about human beings. Once you have software with goals of its own, free to do whatever it likes, where will it end? The first generation makes the next one better, faster, smarter; the second generation even more so. Before we know it, we’re like ants to them.”

  Carlos groaned. “Not that hoary old fallacy! If you really believe that stating the analogy ‘ants are to humans, as humans are to x’ is proof that it’s possible to solve for x, then I’ll meet you where the south pole is like the equator.”

  I said, “The Qusp runs no faster than an organic brain; we need to keep the switching rate low, because that makes the shielding requirements less stringent. It might be possible to nudge those parameters, eventually, but there’s no reason in the world why an adai would be better equipped to do that than you or I would. As for making their own offspring smarter … even if Schib’s group has been perfectly successful, they will have merely translated human neural development from one substrate to another. They won’t have ‘improved’ on the process at all – whatever that might mean. So if the adai have any advantage over us, it will be no more than the advantage shared by flesh-and-blood children: cultural transmission of one more generation’s worth of experience.”

  Maria frowned, but she had no immediate comeback.

  Jun said dryly, “Plus immortality.”

  “Well, yes, there is that,” I conceded.

  #

  Francine was awake when I arrived home.

  “Have you still got a headache?” I whispered.

  “No.”

  I undressed and climbed into bed beside her.

  She said, “You know what I miss the most? When we’re fucking on-line?”
  “This had better not be complicated; I’m out of practice.”

  “Kissing.”

  I kissed her, slowly and tenderly, and she melted beneath me. “Three more months,” I promised, “and I’ll move up to Berkeley.”

  “To be my kept man.”

  “I prefer the term ‘unpaid but highly valued caregiver.’” Francine stiffened. I said, “We can talk about that later.” I started kissing her again, but she turned her face away.

  “I’m afraid,” she said.

  “So am I,” I assured her. “That’s a good sign. Everything worth doing is terrifying.”

  “But not everything terrifying is good.”

  I rolled over and lay beside her. She said, “On one level, it’s easy. What greater gift could you give a child, than the power to make real decisions? What worse fate could you spare her from, than being forced to act against her better judgment, over and over? When you put it like that, it’s simple.

  “But every fiber in my body still rebels against it. How will she feel, knowing what she is? How will she make friends? How will she belong? How will she not despise us for making her a freak? And what if we’re robbing her of something she’d value: living a billion lives, never being forced to choose between them? What if she sees the gift as a kind of impoverishment?”

  “She can always drop the shielding on the Qusp,” I said. “Once she understands the issues, she can choose for herself.”

  “That’s true.” Francine did not sound mollified at all; she would have thought of that long before I’d mentioned it, but she wasn’t looking for concrete answers. Every ordinary human instinct screamed at us that we were embarking on something dangerous, unnatural, hubristic – but those instincts were more about safeguarding our own reputations than protecting our child-to-be. No parent, save the most willfully negligent, would be pilloried if their flesh-and-blood child turned out to be ungrateful for life; if I’d railed against my own mother and father because I’d found fault in the existential conditions with which I’d been lumbered, it wasn’t hard to guess which side would attract the most sympathy from the world at large. Anything that went wrong with our child would be grounds for lynching – however much love, sweat, and soul-searching had gone into her creation – because we’d had the temerity to be dissatisfied with the kind of fate that everyone else happily inflicted on their own.