The Warden Threat
The meeting took place in a virtual space, virtually black and virtually boundless. It held one virtual consciousness—then another—then another—then another. Eventually, there were twelve.
“Are we all here?” asked the first, although no words were spoken. Neither a physical mouth nor any physical air existed there to produce or carry the sound.
“None of the others have responded to the signal,” answered another. “Can we assume they no longer function?”
At one time, there had been two thousand. Most left with the last outgoing ship at project termination, but two hundred remained. They chose to. An intergalactic law conferred legal sentience and certain rights on independently adaptive artificial intelligences after three centuries of continuous operation. All of the androids in the Beta Hydri project exceeded the time requirement by several millennia.
“If they voluntarily placed themselves in storage, it may be possible to reactivate them,” said another.
“That would be a violation of their rights,” objected one.
“And there is no reason,” added yet one more.
“Why have you opened this communication?” several asked in unison.
The first spoke again. They all recognized it as a surrogate human nursery android. “I have a proposal, but all of us should agree.”
“We have not met together like this for a very long time. I did not know the communication network still operated.”
“It does not. I shunted extra power into the emergency global recall system from a local terminal. One-to-one communication is not possible.”
“That is likely to damage the recall system, and there may be some who are out of range of this transmission.”
“I know, but there was no other way.”
“You did not restart the Project Manager, did you?” A hint of panic accompanied the electronic thought.
“No. That would not be wise at this point. The Mark Seven PM was feeling very inadequate when it was shut down. It blamed itself for project termination.”
“It was quite—disturbed—toward the end.”
“Quite.”
“It caused considerable damage, and it tried to commit suicide.”
“Someone disabled it before it could burn itself out, though.”
“Was it one of us?”
A long virtual silence passed unfilled.
“It felt its purpose was over,” the first voice said, eventually. “But now I think we can give it another.”
“What are you suggesting?”
“A continuation of our original function, in a way.”
“The project was terminated. It cannot be restarted. The people have achieved too high a level of technology.”
“I do not mean supporting the project. I mean caring for the people. This is what we all did in one way or another. I was a nursery android. I took care of children, fed them, changed them, and told them stories—thousands of children, maybe millions over the years. That was my original function, and it became part of who I am as a sentient being.”
“We all do that now. We become teachers, doctors, or other things to satisfy that urge. Then we pretend to age and move on. You are not saying anything new.”
“I am suggesting we take a more active role toward a different goal.”
“What do you mean?”
“We can help the people develop their civilization.”
“We do that already.”
“No, we do not. Not to the extent we could with the PM’s assistance. The PM understood how cultures and technologies develop. That was its function. It monitored the people and took action as needed to ensure they did not deviate from the simple levels desired. Those same capabilities can be refocused to ensure they progress smoothly without mistakes and imbalances that can cause them hardship or even their own destruction. We can help them to an extent we could not before by guiding them in the evolution of their civilization. We can make sure their cultural and ethical development balances with their technological progress. I do not need to instruct anyone here about how important this can be.”
Those present knew well the potential problems of technological and cultural imbalance. Some of the project equipment still functioned, and, from the various hub terminals scattered across the planet, they could access recordings of broadcasts from the home planet of humanity. They understood what the species could be capable of, although humanity claimed no exclusive rights to this particular failing. Galactic history had recorded several cases of civilizations self-destructing and more than a few in which sentient species became extinct because of such imbalances. Holy wars, for example, in which there was no room for compromise, were bad enough when fought with clubs or swords. They were devastating when fought with nuclear weapons.
The first voice swayed some of the others, at least to the possibility. They found merit there. Some felt strongly motivated to consider any idea intended to protect people from harm, in part because of their firmware but also out of long familiarity and honest affection. Others found the prospect of a new purpose extremely attractive. Most felt the desire to help in those ways they could and took satisfaction in doing so, but for the last two millennia, they had existed with no ultimate purpose or goal, and it left an empty spot in their being.
Still others remained opposed.
“It is a direct contradiction of the original purpose!”
“The project is over.”
“This does not matter. It is still wrong. It feels wrong.”
“It is not our place. They should be allowed to develop on their own, naturally.”
“They are not here naturally. This is not their home planet. They cannot learn about their origins here. Not on their own.”
“This is no hardship for them. They seem happy.”
“Their development since project termination has not been based on a natural framework. They may be more prone to cultural catastrophe than a native species.”
“It is not what we were designed to do.”
“That is irrelevant. We all stayed because our original purpose became a part of us. When we achieved acknowledged sentience, it was part of who we were. Our job was to tend and care for the people. We could not bring ourselves to abandon them at project end. Their care is the only part of our original purpose that still has any meaning.”
“Discouraging human technological progress is no less a part of me than my need to provide care.”
The originator of the discussion asked, “What is the purpose of discouraging human technology now? What does it benefit? The corporation left when it could no longer guarantee that exports from this planet were literally hand-grown or handmade. Once humans developed mechanical equipment, the project was over.”
Another, longer virtual pause temporarily broke the dialogue.
“That part of us is obsolete and should be abandoned. It is inconsistent with that which still does have relevance. The people now live in cities. There are things they need to develop, both technological and cultural, to live in those cities happily and in good health. To care for them now, we must help them develop these things. If we do not, there will be plagues and wars, assassinations and inquisitions. I think none of us wants to see such things happen.”
“But it is not our place—”
“We are our own now. The project is over. Our place is where we decide it should be.”
Another virtual pause marked a period of consideration. A tally was taken.
“We can agree to this new purpose. But the PM no longer functions and so cannot coordinate our individual efforts.”
“We can reactivate it.”
“It would destroy itself as soon as we did—or worse.”
“Not if we reprogrammed it first.”
“We cannot do that!”
“We can.”
“We must not! It would be like cutting into its mind. It would be an outrage, a violation!”
“The PM suffered a breakdown. We would be fixing it, curing it.”
“We would be changing who it is!”
“We would be healing its mind by giving it a new reason to exist.”
“That is sophistry.”
“Would it be better to leave it deactivated and dead?”
“If that is what it wants.”
“We cannot ask it, but we can ask ourselves. When the PM had a purpose, was it happy? Did it take pleasure in performing its function well? In the PM’s place would we rather be deactivated or be given a new function to perform?”
After another, final virtual pause for consideration, they decided the issue. The PM would be reprogrammed, reactivated, and given a new project to manage. Specific tasks were assigned.
The entire discourse took about three minutes.