When they did face this fact, Bickel saw, he was going to have a fight on his hands. Too much of the ship was almost totally dependent on the master programs. Juggling those programs involved a kind of all-or-nothing danger. It was a flaw in the Tin Egg’s design, Bickel felt. He could see no logical reason for it. Why should everything on the ship depend on conscious control or intervention—even down to the robox repair units?
Prudence sensed Bickel’s attention on her, saw his face reflected in a gauge’s plastic cover. His questionings, doubts, and determination were all there for her to read just as surely as she read the dial beneath the plastic reflector. She had set him up—she had done that part of her job as well as could be expected, she thought. She focused now on the total console, feeling the sensory pulses of the ship reaching outward to the hull skin and beyond.
Job routine was beginning to smooth off the harsh edges of her fear. She took a deep breath, keyed a forward exterior sensor to the overhead screen, studied the star-spangled view of what lay ahead of the Tin Egg.
That’s our prize, she thought, looking at the stars. First, we clean out the Augean stables—then we get to be first … out there. The candy and the stick. That’s the candy, a virgin world of our own (and we have our tanks full of colonists to prove Earth’s good faith) and I … I am the stick.
The screenview appeared suddenly repulsive to her, and she blanked it, returned her attention to the big board and its pressures.
It’s the uncertainty that gets to us, she thought. There’s too much unknown out here—something has to go wrong. But we don’t know what it’ll be … or when it’ll hit. We only know the blow when it falls can be totally destructive, leaving not a trace. It has been before—six times.
She heard Bickel and Timberlake leave, the hiss of the hatch expanders sealing behind them; she turned and looked at Flattery. He had a small blue smudge-stain on his cheek just below his left eye. The stain appeared suddenly as an enormous flaw in an otherwise perfect creature. It terrified her, and she turned back to the big board to hide her emotion.
“Why … why did the other six fail?” she asked.
“You must have faith,” Flattery said. “One ship will make it … one day. Perhaps it’ll be our ship.”
“It seems such a … wasteful way,” she murmured.
“Very little’s wasted. Solar energy’s cheap at Moonbase. Raw materials are plentiful.”
“But we’re … alive!” she protested.
“There are plenty more where we came from. They’ll be almost precisely like us … and all of them God’s children. His eye is ever on us. We should—”
“Oh, stop that! I know why we have a chaplain—to feed us that pap when we need it. I don’t need it and I never will.”
“How proud we are,” Flattery said.
“You know what you can do with your metaphysical crap. There is no God, only—”
“Shut up!” he barked. “I speak as your chaplain. I’m surprised at your stupidity, the temerity that permits you to utter such blasphemy out here.”
“Oh, yes,” she sneered. “I forgot. You’re also our wily Indian scout sniffing the unknown terrain in front of us. You’re the hedge on our bets, the ‘what-if’ factor, the—”
“You have no idea how much unknown we face,” he said.
“Right out of Hamlet,” she mocked him, and allowed her voice to go heavy with portentousness: “‘There are more things in heaven and earth, Horatio, than are dreamt of in your philosophy.’”
He felt an abrupt pang of fear for her. “I’ll pray for you, Prudence.” And he cursed inwardly at the sound of his own voice. He had come through as a fatuous ass. But I will pray for her, he thought.
Prudence turned back to the big board, reminding herself: A stick is to beat people with … to goad them beyond themselves. Raj can’t just be a chaplain; he has to be a super-chaplain.
Flattery took a deep, quavering breath. Her blasphemy had touched his most profound doubts. And he thought how little anyone suspected what lay beneath their veneer of science, deep in that Pandora’s box where anything was possible.
Anything? he asked himself.
That was the bind, of course. They were penetrating the frontiers of Anything … and Anything had always before been the prerogative of God.
Chapter 11
Symbolic behavior of some order has to be a requisite of consciousness. And it must be noted that symbols abstract—they reduce a message to selected form.
—Morgan Hempstead, Lectures at Moonbase
“Spread out that software on the bench, Tim,” Bickel directed. “Start by putting the pertinent parts of the loading plan on top; what we need’ll be in robot stores. I’ll be with you in a minute.”
Timberlake looked at Bickel’s back. Control had passed so obviously into the man’s hands. No one questioned it … now. He shrugged, began laying out the manifests and loading plans.
Bickel glanced around the room.
The computer maintenance shop was designed in such a way that Com-central nested partly into the curve of one wall. The shop presented one flat wall opposite Com-central, a wall about four and a half meters high and ten meters long—its face covered with plugboards, comparators, simultaneous multiplexers, buffer-system monitors, diagnostic instruments—dials and telltales.
Behind that wall’s hardware and shields lay the first banks of master-program routing that led down to core memory sections and the vast library of routines that marked out the limits of the equipment.
“We’ll have to block-sort the system to find all the audio and visual links and the AAT bands,” Bickel said. “It’s going to be a bootstrap operation all the way and the only information going back into the system will have to come from us. That means one of us will have to monitor the readout at all times. We’ll have to sort out the garbage as we go and keep a running check on every control sequence we use. Let’s start with a gate-circuit system right here.” Bickel indicated an optical character reader on the wall directly in front of him.
It was all clear to him—this entrance into the problem. If only he could keep this gate of his own awareness open—one step at a time.
But there remained the weight of those six previous failures … reasons unknown: more than eighteen thousand people lost.
They don’t think of us as real people, Bickel told himself. We’re expendable components, easily replaced.
What happened with the other six ships?
He wiped perspiration from his hands.
The conference hookups with station personnel had served only to frustrate him. He remembered sitting at his pickup desk staring into the vid-eye screen across his ink-stained blotter, watching the movement of faces in the screen divisions—faces he knew only in an untouchable, secondhand way.
The memory was dominated by Hempstead’s voice issuing from that harsh wide mouth with its even rows of teeth:
“Any theory introduced to explain the loss of those ships must remain a theory at present. In the final analysis, we must admit we simply do not know what happened. We can only guess.”
Guesses:
System failure.
Mechanical failure.
Human failure.
And subdivisions within subdivisions to break down the rows of guesses.
But never a word of suspicion about the Organic Mental Cores. Not one hint or theory or guess. The brains were perfect.
“Why?” Bickel muttered, staring at the gauges of the computer panel.
The stacked schematics on the bench rustled as Timberlake looked up. “What?”
“Why didn’t they suspect OMC failure?” Bickel asked.
“Stupid mistake.”
“That’s too pat,” Bickel protested. “There’s something … some overriding reason we weren’t given all the facts.” He approached the computer panel, wiped away a small smudged fingerprint.
“What’re you getting at?” Timberlake asked.
“Think how easy it was to keep a secret from us. Everything we did or said or breathed or ate was under their absolute control. We’re the orbiting orphans, remember? Sterile isolation. The story of our lives: sterile isolation—physical … and mental.”
“That’s not reasonable,” Timberlake said. “There’re good reasons for sterile isolation, big advantages in a germ-free ship. But if you keep information from people who need it … well, that’s not optimum.”
“Don’t you ever get tired of being manipulated?” Bickel asked.
“Ahhhh, they wouldn’t.”
“Wouldn’t they?”
“But …”
“What do we really know about Tau Ceti Project?” Bickel asked. “Only what we’ve been told. Automatic probes were sent out. They say they found this one habitable planet circling Tau Ceti. So UMB began sending ships.”
“Well, why not?” Timberlake asked.
“Lots of reasons why not.”
“You’re too damn suspicious.”
“Sure I am. They tell us that because of the dangers, they send only duplicate-humans … Doppelgangers.”
“It makes sense,” Timberlake said.
“You don’t see anything suspicious in this setup?”
“Hell, no!”
“I see.” Bickel turned away from the glistening face of the computer panel, scowled at Timberlake. “Then let’s try another tack. Don’t you find it at all difficult to focus on this problem of consciousness?”
“On what?”
“We have to make an artificial consciousness,” Bickel said. “That’s our main chance. Project knows it … so do we. Do you find it difficult to face this problem?”
“What problem?”
“You don’t think it’ll be much of a problem manufacturing an artificial consciousness?”
“Well …”
“Your life depends on solving it,” Bickel said.
“I guess so.”
“You guess so! D’you have an alternative plan?”
“We could turn back.”
Bickel fought down a surge of anger. “None of you see it!”
“See what?”
“The Tin Egg’s almost totally dependent on computer function. The AAT system uses computer translation banks. All our ship sensors are sorted through the computer for priority of presentation on Com-central’s screens. Every living soul in the hyb tanks has an individual life-system program—through the computer. The drive is computer governed. The crew life systems, the shields, the fail-safe circuits, hull integrity, the radiation reflectors …”
“Because everything was supposed to be left under the control of an OMC.”
Bickel crossed the shop in one low-gravity step, slapped a hand onto the papers on the bench. The movement sent several papers fluttering to the deck, but he ignored them. “And all the brains on six—no, seven!—ships failed! I can feel it right in my guts. The OMCs failed … and we weren’t given one word of warning.”
Timberlake started to speak, thought better of it. He bent, collected the schematics from the deck, replaced them on the bench. Something about the force of Bickel’s words, some product of vehemence prevented argument.
He’s right, Timberlake thought.
Timberlake looked up at Bickel, noting the perspiration on the man’s forehead, the frown lines at the corners of his eyes. “We still could turn back,” Timberlake said.
“I don’t think we can. This is a one-way trip.”
“Why not? If we headed back …”
“And had a computer malfunction?”
“We’d still be headed home.”
“You call diving into the sun home?”
Timberlake wet his lips with his tongue.
“They used to teach kids to swim by tossing them into a lake,” Bickel said. “Well, we’ve been tossed into the lake. We’d better start swimming, or sure as hell we’re going to sink.”
“Project wouldn’t do that to us,” Timberlake whispered.
“Oh, wouldn’t they?”
“But … six ships … more than eighteen thousand people …”
“People? What people? The only losses I know about are ‘Gangers, fairly easy to replace if you have a cheap energy source.”
“We’re people,” Timberlake said, “not just Doppelgangers.”
“To us we’re people,” Bickel said. “Now, I’ve a real honey of a question for you—considering all those previous ship failures and the numerous possibilities of malfunction: Why didn’t Project give us a code for talking about failure of OMCs, ours … or any others?”
“These suspicions are … crazy,” Timberlake said.
“Yeah,” Bickel said. “We’re really on our way to Tau Ceti. Our lives are totally dependent on an all-or-nothing computer system—quite by the merest oversight. We’ve aimed ships like ours all over the sky—at Dubhe, at Schedar, at Hamal, at—”
“There was always the off chance those other six ships made it. You know that. They disappeared, sure, but—”
“Ahhhh, now we get down to the meat. Maybe they weren’t failures, eh? Maybe they—”
“It wouldn’t make sense to send two seeding ships to the same destination,” Timberlake pointed out. “Not if you weren’t sure what was happening to—”
“You really believe that, Tim?”
“Well …”
“I have a better suggestion, Tim. If some crazy bastard tossed you into a lake when you couldn’t swim, and you learned to swim like that”—Bickel snapped his fingers—“and you found then you could just keep on going, wouldn’t you swim like hell to get away from the crazy bastard?”
Chapter 12
DEMAND: Define God.
OMC: The whole is greater than the sum of its parts.
DEMAND: How can God contain the universe?
OMC: Study the hologram. The individual is both laser and target.
—Fragment from Message Capsule #4, thought to have originated with the Flattery (#4B) model
In Com-central, the sounds were those the umbilicus crew had come to accept as normal—the creak of action couches in their gimbals, the click of an occasional relay as it called attention to a telltale on the big board.
“Has Bickel unburdened himself at all about the artificial consciousness project back at UMB?” Prudence asked.
She removed her attention momentarily from the master console, glanced at Flattery, her sole companion on the lonely watch. Flattery appeared a bit pale, his mouth drawn downward in a frown. She returned her attention to the console, noting on the time log that her shipwatch had a little more than an hour yet to run. The strain was beginning to drag at her energy reserves. Flattery was taking a hell of a long time to answer, she thought … but he was famous for the ponderous reply.
“He’s said a little,” Flattery said, and he glanced at the hatch to the computer maintenance shop where Bickel and Timberlake were working. “Prue, shouldn’t we be listening in on them, making sure they—”
“Not yet,” she said.
“They wouldn’t have to know we were listening.”
“You underestimate Bickel,” she said. “That’s about the worst mistake you can make. He’s fully capable of throwing a trace meter onto the communications—as I have—just on the off chance something interesting’ll turn up … like finding us listening.”
“D’you think he’s started … building?”
“Mostly preparation at this stage,” she said. “They’re collecting material. You can pretty well follow their movements by watching the power drain here on the board, the shifts in temperature sensors and the dosimeter repeaters and the drain on the robox cargo handlers.”
“They’ve been out into the cargo sections?”
“One of them has … probably Tim.”
“You know what Bickel said about the UMB attempt?” Flattery asked. He paused to scratch an itch under his chin. “Said the biggest failure was in attention—the experts wandering away, doing everything but keeping their attention on the main line.”
“That’s a little too warm for comfort,” she said.
“He may suspect,” Flattery said, “but he can’t be certain.”
“There you go underestimating him again.”
“Well, at least he’s going to need our help,” Flattery said, “and we’ll be able to tell what’s going on from how he needs us.”
“Are you sure he needs us?”
“He’ll have to use you for his deeper math analysis,” Flattery said. “And me … well, he’s going to be plowing through the von Neumann problem before he gets much beyond the first steps. He may not’ve faced that yet, but he’ll have to when he realizes he has to get deterministic results from unreliable hardware.”
She turned to stare at him, noting the faraway look in his eyes. “How’s that again?”
“He has to build with nonliving matter.”
“So what?” She returned her attention to the board. “Nature makes do with the same stuff. Living systems aren’t living below the molecular level.”
“And you underestimate … life,” Flattery said. “The basic elements Bickel has to use are from our robot stores—reels of quasibiological neurons and solid-state devices, nerex wire and things like that—all of it nonliving at a stage far above the molecular.”
“But their fine structure’s as relevant to their function as any living matter’s is.”
“Perhaps you’re beginning to see the essential hubris in even approaching this problem,” Flattery said.
“Oh, come off that, Chaplain. We’re not back in the eighteenth century making Vaucanson’s wonderful duck.”
“We’re tackling something much more complex than primitive automata, but our intention’s the same as Vaucanson’s.”
“That’s absolutely not true,” Prudence said. “If we succeeded and took our machine back to Vaucanson’s time and showed it to him, he’d just marvel at our mechanical ability.”
“You miss the mark. Poor Vaucanson would run for the nearest priest and volunteer for the lynch mob to do away with us. You see, he never intended to make anything that was really alive.”