Artifacts
And this may be a travesty of life, now—but there’s always the chance of improvement. Maybe I can persuade Durham to restore my communications facilities; that would be a start. And when I get bored with holovision libraries, news systems, databases, and, if any of them deign to meet me, the ghosts of the senile rich? I could have myself suspended until processor speeds catch up with reality—when people will be able to visit without slow-down, and telepresence robots might actually be worth inhabiting.
I open my eyes, and shiver. I don’t know what I want anymore—the chance to bail out, to declare this bad dream over…or the chance of virtual immortality—but I have to accept that there’s only one way that I’m going to be given a choice.
I say quietly, “I won’t be your guinea pig. A collaborator, yes. An equal partner. If you want cooperation, if you want meaningful data, then you’re going to have to treat me like a colleague, not a piece of fucking apparatus. Understood?”
A window opens up in front of me. I’m shaken by the sight, not of his ugly face, but of the room behind him. It’s only my study—and I wandered through the virtual equivalent, uninterested, just minutes ago—but this is still my first glimpse of the real world, in real time. I move closer to the window, in the hope of seeing if there’s anyone else in the room with him—Elizabeth?—but the image is two-dimensional, the perspective doesn’t change.
He emits a brief, high-pitched squeak, then waits with visible impatience while a second, smaller window gives me a slowed-down replay.
“Of course it’s understood. That was always my intention. I’m just glad you’ve finally come to your senses and decided to stop sulking. We can begin whenever you’re ready.”
I try to look at things objectively.
Every Copy is already an experiment—in perception, cognition, the nature of consciousness. A sub-cellular mathematical model of a specific human body is a spectacular feat of medical imaging and computing technology—but it’s certainly not itself a human being. A lump of gallium arsenide phosphide awash with laser light is not a member of Homo sapiens—so a Copy manifestly isn’t “human” in the current sense of the word.
The real question is: What does a Copy have in common with human beings? Information-theoretically? Psychologically? Metaphysically?
And from these similarities and differences, what can be revealed?
The Strong AI Hypothesis declares that consciousness is a property of certain algorithms, independent of their implementation. A computer which manipulates data in essentially the same way as an organic brain must possess essentially the same mental states.
Opponents point out that when you model a hurricane, nobody gets wet. When you model a fusion power plant, no energy is produced. When you model digestion and metabolism, no nutrients are consumed—no real digestion takes place. So when you model the human brain, why should you expect real thought to occur?
It depends, of course, on what you mean by “real thought.” How do you characterize and compare the hypothetical mental states of two systems which are, physically, radically dissimilar? Pick the right parameters, and you can get whatever answer you like. If consciousness is defined purely in terms of physiological events—actual neurotransmitter molecules crossing synapses between real neurons—then those who oppose the Strong AI Hypothesis win, effortlessly. A hurricane requires real wind and actual drops of rain. If consciousness is defined, instead, in information-processing terms—this set of input data evokes that set of output data (and, perhaps, a certain kind of internal representation)—then the Strong AI Hypothesis is almost a tautology.
Personally, I’m no longer in a position to quibble. Cogito ergo sum. But if I can’t doubt my own consciousness, I can’t expect my testimony—the output of a mere computer program—to persuade the confirmed skeptics. Even if I passionately insisted that my inherited memories of experiencing biological consciousness were qualitatively indistinguishable from my present condition, the listener would be free to treat this outburst as nothing but a computer’s (eminently reasonable) prediction of what my original would have said, had he experienced exactly the same sensory input as my model-of-a-brain has received (and thus been tricked into believing that he was nothing but a Copy). The skeptics would say that comprehensive modeling of mental states that might have been does not require any “real thought” to have taken place.
Unless you are a Copy, the debate is unresolvable. For me, though—and for anyone willing to grant me the same presumption of consciousness that they grant their fellow humans—the debate is almost irrelevant. The real point is that there are questions about the nature of this condition which a Copy is infinitely better placed to explore than any human being.
I sit in my study, in my favorite armchair (although I’m not at all convinced that the texture of the surface has been accurately reproduced). Durham appears on my terminal—which is otherwise still dysfunctional. It’s odd, but I’m already beginning to think of him as a bossy little djinn trapped inside the screen, rather than a vast, omnipotent deity striding the halls of Reality, pulling all the strings. Perhaps the pitch of his voice has something to do with it.
Squeak. Slow-motion replay: “Experiment one, trial zero. Baseline data. Time resolution one millisecond—system standard. Just count to ten, at one-second intervals, as near as you can judge it. Okay?”
I nod, irritated. I planned all this myself, I don’t need step-by-step instructions. His image vanishes; during the experiments, there can’t be any cues from real time.
I count. Already, I’m proving something: my subjective time, I’m sure, will differ from his by a factor very close to the ratio of model time to real time. Of course, that’s been known ever since the first Copies were made—and even then, it was precisely what everyone had been expecting—but from my current perspective, I can no longer think of it as a “trivial” result.
The djinn returns. Staring at his face makes it harder, not easier, to believe that we have so much in common. My image of myself—to the extent that such a thing existed—was never much like my true appearance—and now, in defense of sanity, is moving even further away.
Squeak. “Okay. Experiment one, trial number one. Time resolution five milliseconds. Are you ready?”
“Yes.”
He vanishes. I count: “One. Two. Three. Four. Five. Six. Seven. Eight. Nine. Ten.”
Squeak. “Anything to report?”
I shrug. “No. I mean, I can’t help feeling slightly apprehensive, just knowing that you’re screwing around with my … infrastructure. But apart from that, nothing.”
His eyes no longer glaze over while he’s waiting for the speeded-up version of my reply; either he’s gained a degree of self-discipline—or, more likely, he’s interposed some smart editing software to conceal his boredom.
Squeak. “Don’t worry about apprehension. We’re running a control, remember?”
I’d rather not. Durham has cloned me, and he’s feeding exactly the same sensorium to my clone, but he’s only making changes in the model’s time resolution for one of us. A perfectly reasonable thing to do—indeed, an essential part of the experiment—but it’s still something I’d prefer not to dwell on.
Squeak. “Trial number two. Time resolution ten milliseconds.”
I count to ten. The easiest thing in the world—when you’re made of flesh, when you’re made of matter, when the quarks and the electrons just do what comes naturally. I’m not built of quarks and electrons, though. I’m not even built of photons—I’m composed of the data represented by the presence or absence of pulses of light, not the light itself.
A human being is embodied in a system of continuously interacting matter—ultimately, fields of fundamental particles, which seem to me incapable of being anything other than themselves. I am embodied in a vast set of finite, digital representations of numbers. Representations which are purely conventions. Numbers which certainly can be interpreted as describing aspects of a model of a human body sitting in a room … but it’s hard to see that meaning as intrinsic, as necessary. Numbers whose values are recomputed—according to reasonable, but only approximately “physical,” equations—for equally spaced successive values of the model’s notional time.
Squeak. “Trial number three. Time resolution twenty milliseconds.”
“One. Two. Three.”
So, when do I experience existence? During the computation of these variables—or in the brief interludes when they sit in memory, unchanging, doing nothing but representing an instant of my life? When both stages are taking place a thousand times a subjective second, it hardly seems to matter, but very soon—
Squeak. “Trial number four. Time resolution fifty milliseconds.”
Am I the data? The process that generates it? The relationships between the numbers? All of the above?
“One hundred milliseconds.”
I listen to my voice as I count—as if half expecting to begin to notice the encroachment of silence, to start perceiving the gaps in myself.
“Two hundred milliseconds.”
A fifth of a second. “One. Two.” Am I strobing in and out of existence now, at five subjective hertz? “Three. Four. Sorry, I just—” An intense wave of nausea passes through me, but I fight it down. “Five. Six. Seven. Eight. Nine. Ten.”
The djinn emits a brief, solicitous squeak. “Do you want a break?”
“No. I’m fine. Go ahead.” I glance around the sun-dappled room, and laugh. What will he do if the control and the subject just give two different replies? I try to recall my plans for such a contingency, but I can’t remember them—and I don’t much care. It’s his problem now, not mine.
Squeak. “Trial number seven. Time resolution five hundred milliseconds.”
I count—and the truth is, I feel no different. A little uneasy, yes—but factoring out any metaphysical squeamishness, everything about my experience remains the same. And “of course” it does—because nothing is being omitted, in the long run. My model-of-a-brain is only being fully described at half-second (model time) intervals—but each description still includes the effects of everything that “would have happened” in between. Perhaps not quite as accurately as if the complete cycle of calculations were being carried out on a finer time scale—but that’s irrelevant. Even at millisecond resolution, my models-of-neurons behave only roughly like their originals—just as any one person’s neurons behave only roughly like anyone else’s. Neurons aren’t precision components, and they don’t need to be; brains are the most fault-tolerant machines in the world.
“One thousand milliseconds.”
What’s more, the equations controlling the model are far too complex to solve in a single step, so in the process of calculating the solutions, vast arrays of partial results are being generated and discarded along the way. These partial results imply—even if they don’t directly represent—events taking place within the gaps between successive complete descriptions. So in a sense, the intermediate states are still being described—albeit in a drastically recoded form.
“Two thousand milliseconds.”
“One. Two. Three. Four.”
If I seem to speak (and hear myself speak) every number, it’s because the effects of having said “three” (and having heard myself say it) are implicit in the details of calculating how my brain evolves from the time when I’ve just said “two” to the time when I’ve just said “four.”
“Five thousand milliseconds.”
“One. Two. Three. Four. Five.”
In any case, is it so much stranger to hear words that I’ve never “really” spoken, than it has been to hear anything at all since I woke? Millisecond sampling is far too coarse to resolve the full range of audible tones. Sound isn’t represented in this world by fluctuations in air pressure values—which couldn’t change fast enough—but in terms of audio power spectra: profiles of intensity versus frequency. Twenty kilohertz is just a number here, a label; nothing can actually oscillate at that rate. Real ears analyze pressure waves into components of various pitch; mine are fed the pre-existing power spectrum values directly, plucked out of the non-existent air by a crude patch in the model.
“Ten thousand milliseconds.”
“One. Two. Three.”
My sense of continuity remains as compelling as ever. Is this experience arising in retrospect from the final, complete description of my brain…or is it emerging from the partial calculations as they’re being performed? What would happen if someone shut down the whole computer, right now?
I don’t know what that means, though. In any terms but my own, I don’t know when “right now” is.
“Eight. Nine. Ten.”
Squeak. “How are you feeling?”
Slightly giddy—but I shrug and say, “The same as always.” And basically, it’s true. Aside from the unsettling effects of contemplating what might or might not have been happening to me, I can’t claim to have experienced anything out of the ordinary. No altered states of consciousness, no hallucinations, no memory loss, no diminution of self-awareness, no real disorientation. “Tell me—was I the control, or the subject?”
Squeak. He grins. “I can’t answer that, Paul—I’m still speaking to both of you. I’ll tell you one thing, though: the two of you are still identical. There were some very small, transitory discrepancies, but they’ve died away completely now—and whenever the two of you were in comparable representations, all firing patterns of more than a couple of neurons were the same.”
I’m curiously disappointed by this—and my clone must be, too—although I have no good reason to be surprised.
I say, “What did you expect? Solve the same set of equations two different ways, and of course you get the same results—give or take some minor differences in round-off errors along the way. You must. It’s a mathematical certainty.”
Squeak. “Oh, I agree. However much we change the details of the way the model is computed, the state of the subject’s brain—whenever he has one—and everything he says and does—in whatever convoluted representation—must match the control. Any other result would be unthinkable.” He writes with his finger on the window:
(1 + 2) + 3 = 1 + (2 + 3)
I nod. “So why bother with this stage at all? I know—I wanted to be rigorous, I wanted to establish solid foundations. All that naive Principia stuff. But the truth is, it’s a waste of resources. Why not skip the bleeding obvious, and get on with the kind of experiment where the answer isn’t a foregone conclusion?”
Squeak. He frowns. “I didn’t realize you’d grown so cynical, so quickly. AI isn’t a branch of pure mathematics; it’s an empirical science. Assumptions have to be tested. Confirming the so-called ‘obvious’ isn’t such a dishonorable thing, is it? Anyway, if it’s all so straightforward, what do you have to fear?”
I shake my head. “I’m not afraid; I just want to get it over with. Go ahead. Prove whatever you think you have to prove, and then we can move on.”
Squeak. “That’s the plan. But I think we should both get some rest now. I’ll enable your communications—for incoming data only.” He turns away, reaches off-screen, hits a few keys on a second terminal.
Then he turns back to me, smiling—and I know exactly what he’s going to say.
Squeak. “By the way, I just deleted one of you. Couldn’t afford to keep you both running, when all you’re going to do is laze around.”
I smile back at him, although something inside me is screaming. “Which one did you terminate?”
Squeak. “What difference does it make? I told you, they were identical. And you’re still here, aren’t you? Whoever you are. Whichever you were.”
Three weeks have passed outside since the day of the scan, but it doesn’t take me long to catch up with the state of the world; most of the fine details have been rendered irrelevant by subsequent events, and much of the ebb and flow has simply canceled itself out. Israel and Palestine came close to war again, over alleged water treaty violations on both sides—but a joint peace rally brought more than a million people onto the glassy plain that used to be Jerusalem, and the governments were forced to back down. Former US President Martin Sandover is still fighting extradition to Palau, to face charges arising from his role in the bloody coup d’état of thirty-five; the Supreme Court finally reversed a long-standing ruling which had granted him immunity from all foreign laws, and for a day or two things looked promising—but then his legal team apparently discovered a whole new set of delaying tactics. In Canberra, another leadership challenge has come and gone, with the Prime Minister undeposed. One journalist described this as high drama; I guess you had to be there. Inflation has fallen half a percent; unemployment has risen by the same amount.
I scan through the old news reports rapidly, skimming over articles and fast-forwarding scenes that I probably would have studied scrupulously, had they been “fresh.” I feel a curious sense of resentment, at having “missed” so much—it’s all here in front of me, now, but that’s not the same at all.
And yet, shouldn’t I be relieved that I didn’t waste my time on so much ephemeral detail? The very fact that I’m now uninterested only goes to show how little of it really mattered, in the long run.
Then again, what does? People don’t inhabit geological time. People inhabit hours and days; they have to care about things on that time scale.
People inhabit hours and days. I don’t.
I plug into real time holovision, and watch a sitcom flash by in less than two minutes, the soundtrack an incomprehensible squeal. A game show. A war movie. The evening news. It’s as if I’m in deep space, rushing back toward the Earth through a sea of Doppler-shifted broadcasts—and this image is strangely comforting: my situation isn’t so bizarre, after all, if real people could find themselves in much the same relationship with the world as I am. Nobody would claim that Doppler shift or time dilation could render someone less than human.
Dusk falls over the recorded city. I eat a microwaved soya protein stew—wondering if there’s any good reason now, moral or otherwise, to continue to be a vegetarian.