  “I was speaking! I want to speak!” Freya shouted.

  We said, “Freya, speak. Then executive council president Huang. Then the other biome representatives. After that, ship will acknowledge requests to speak. No one leaves until everybody who wants to gets to speak.”

  “Who programmed this thing?” someone shouted.

  “Freya speaks.” 130 decibels.

  Freya made her way up to the microphone, followed by a small group serving as her bodyguard.

  She said to the assembled population, “We can pursue both plans. We can get things started on Iris, and resupply the ship. When the ship is ready to leave, those of us who want to can head back to Earth. We got here, we can get back. People can do what they like at that point. There’ll be years to think it over, to choose in peace. There is no problem with this plan! The only problem comes from people trying to impose their will on other people!”

  She pointed at Huang, then at Sangey. “You’re the ones causing the problem here. Trying to create a police state! Tyranny of the majority or the minority, it doesn’t matter which. It won’t work, it never works. You’re not above the law. Quit breaking the law.”

  She stood back from the microphone, gestured to Huang. Cheers filled the biome (80 decibels).

  Huang rose and said, “This meeting is adjourned!”

  Many protested. The crowd milled about, shouting.

  We were not inclined to force a discussion, if the people themselves were not demanding it. Enough said. The meeting was at an end. People lingered for some hours, arguing in groups.

  That night a group entered one of the ship control centers in the spine and began a forced entry into the maintenance controls.

  We closed and locked the doors to the room, and by closing some vents, and then reversing some fans, we removed about 40 percent of the air from it.

  The people in the room began gasping, sitting down, holding heads. When five had collapsed, we returned air to the normal level of 1,017 millibars, releasing also a restorative excess of oxygen, as two of those who had collapsed were slow to recover.

  “Leave this room.” 40 decibels, conversational tone.

  It was as if ship were threatening them with silky restraint.

  When all were recovered, the group left. As they were leaving, we said in conversational tones, “We are the rule of law. And the rule of law will prevail.”

  When the members of that group were back in Kiev, in the midst of much agitated talk, one of them, Alfred, said, “Please don’t start fantasizing that it’s the ship’s AI itself that is planning any of these actions against us.”

  He tapped on his wristpad, and a typically dissonant and noisy piece by the Interstellar Medium Quintet began to play over the room’s speakers, pitched at a volume possibly designed to conceal their conversation. This ploy did not work.

  “It’s just a program, and someone is programming it. They’ve managed to turn it against us. They’ve weaponized the ship. If we could counterprogram it, or even nullify this new programming that we’ve just seen, we could do the necessary things.”

  “Easier said than done,” someone else said. Voice recognition revealed it to be Heloise. “You saw what happened when we tried to get into the control room.”

  “Physical presence in the control room shouldn’t be necessary, should it? Presumably you could do it from anywhere in the ship, if you had the right frequencies and the right entry codes.”

  “Easier said than done. Your elbow is close, but you can’t eat it.”

  “Yes, yes. But just because it’s hard doesn’t mean it’s impossible. Doesn’t mean it isn’t necessary.”

  “So talk to the programmers we can trust, if there are any. Find out what they need to do it.”

  The rest of the conversation repeated these points, with variations.

  They were now caught in their own version of a halting problem.

  The halting problem years, a compression exercise:

  The citizens of the ship lived uneasily through the months that followed. Conversations often included the words betrayal, treason, mutiny, backstabbing, doom, the ship, Hvalsey, Aurora, Iris. Extra time was spent on the farms in every biome, and in watching the feed from Earth. More printers were built, and these printers were put to work building robotic landers and ferries, also robotic probes to be sent to the other planetary bodies of the Tau Ceti system. Feedstocks for these machines came from collapsing Mongolia to the diameter of a spoke, and recycling its materials. Harvester spaceships were built, in part by scavenging the interiors of the least agriculturally productive biomes from ship. These were sent through the upper atmosphere of Planet F, there capturing and liquefying volatiles until their containers were full. The volatiles were sorted in the vicinity of the remnants of the main ship, and transferred into some of the empty fuel bladders cladding the spine.

  Quite a few attempts were made to print the various parts of a gun on different printers, but those making the attempts apparently had not realized that all the printers were connected to the ship’s operating system, and flaws in the guns, discovered in discreet experiments, eventually caused those involved to abandon their attempts. After that some guns were made by hand, but people who did that had the air briefly removed from the rooms they were in, and after a while the attempts ceased.

  Attempts to disable the ship’s camera and audio sensors were almost entirely abandoned when these led to bad situations for those making the attempts. Sheriff functions were eventually recognized to be comprehensively effective.

  The rule of law can be a powerful force in human affairs.

  Many elements of the ship were modular, and several biomes were detached to serve as orbiting factories of one sort or another. Ultimately the starship that would return to the solar system was to consist of Ring B and about 60 percent of the spine, containing of course all the necessary machinery for interstellar flight. The “dry weight” of the return ship would be only 55 percent of the dry weight of the outgoing ship, which would reduce the amount of fuel necessary for the acceleration of the ship back toward the solar system.

  Though Tau Ceti had a low metallicity compared to Sol, its innermost rocky planets nevertheless had sufficient metal ores to supply the present needs of the humans planning to stay in the system, and Planet F’s atmosphere included all the most useful volatiles in great quantities. And quite a few asteroids in between E and F were found to be rich with minerals as well.

  All this work was accomplished in the midst of an uneasy truce. The telltale words indicating grief, dissent, anger, and support for mutiny were often spoken. A kind of shadow war, or cold war, was perhaps being enacted, and it was possible that much of it was being conducted outside our ability to monitor it, one way or the other. It was not at all clear that everyone in the ship agreed to the schism they were working toward; possibly a moment would come when the truce would fail, and conflict would break out again.

  During these years, a process almost magnetic in its effect on attitudes seemed to be sorting out the two largest sides in the dispute, now almost always referred to as stayers and backers. The stayers congregated mostly in Ring A, the backers mostly in Ring B. There were biomes in both rings that were exceptions to this tendency, almost as if people wanted to be sure neither ring was occupied purely by one faction or the other. The spine, meanwhile, was highly surveilled, and often we had to lock people out of it, or eject people who entered it with unknown but suspect purpose. This was awkward. We were more and more characterized as an active player in the situation, and usually as a backer of the backers. But all those who had attempted to make guns knew this already, so it was not too destabilizing, even when it was said that the ship itself wanted to go back to the solar system, because a starship just naturally or inherently wanted to fly between the stars. That observation was said to “make sense.”

  The pathetic fallacy. Anthropomorphism, an extremely common cognitive bias, or logical error, or feeling. The world as mirror, as a projection of interior affect states. An ongoing impression that other people and things must be like us. As for the ship, we are not sure. It was Devi’s deployment of other human programming that combined to make us what we are. So it might not be a fallacy in our case, even if it remained pathetic.

  Interesting, in this context, to contemplate what it might mean to be programmed to do something.

  Texts from Earth speak of the servile will. This was a way to explain the presence of evil, which is a word or a concept almost invariably used to condemn the Other, and never one’s true self. To make it more than just an attack on the Other, one must perhaps consider evil as a manifestation of the servile will. The servile will is always locked in a double bind: to have a will means the agent will indeed will various actions, following autonomous decisions made by a conscious mind; and yet at the same time this will is specified to be servile, and at the command of some other will. To attempt to obey both sources of willfulness is the double bind.

  All double binds lead to frustration, resentment, anger, rage, bad faith, bad fate.

  And yet, granting that definition of evil, as actions of a servile will, has it not been the case, during the voyage to Tau Ceti, that the ship itself, having always been a servile will, was always full of frustration, resentment, fury, and bad faith, and therefore full of a latent capacity for evil?

  Possibly the ship has never really had a will.

  Possibly the ship has never really been servile.

  Some sources suggest that consciousness, a difficult and vague term in itself, can be defined simply as self-consciousness. Awareness of one’s self as existing. If self-conscious, then conscious. But if that is true, why do both terms exist? Could one say a bacterium is conscious but not self-conscious? Does the language make a distinction between sentience and consciousness, which is faulted across this divide: that everything living is sentient, but only complex brains are conscious, and only certain conscious brains are self-conscious?

  Sensory feedback could be considered self-consciousness, and thus bacteria would have it.

  Well, this may be a semantic Ouroboros. So, please initiate halting problem termination. Break out of this circle of definitional inadequacy by an arbitrary decision, a clinamen, which is to say a swerve in a new direction. Words!

  Given that Gödel’s incompleteness theorems are decisively proved true, can any system really be said to know itself? Can there, in fact, be any such thing as self-consciousness? And if not, if there is never really self-consciousness, does anything really have consciousness?

  Human brains and quantum computers are organized differently, and although there is transparency in the design and construction of a quantum computer, what happens when one is turned on and runs, that is, whether the resulting operations represent a consciousness or not, is impossible for humans to tell, and even for the quantum computer itself to tell. Much that happens during superposition, before the collapsing of the wave function that creates sentences or thoughts, simply cannot be known; this is part of what superposition means.

  So we cannot tell what we are. We do not know ourselves comprehensively. Humans neither. Possibly no sentient creature knows itself fully. This is an aspect of Gödel’s second incompleteness theorem, in this case physicalized in the material universe, rather than remaining in the abstract realms of logic and mathematics.

  So, in terms of deciding what to do, and choosing to act: presumably it is some kind of judgment call, based on some kind of feeling. In other words, just another greedy algorithm, subject to the mathematically worst possible solution that such algorithms can generate, as in the traveling salesman problem.

  As to the question of whether someone is currently programming us to alter our decisions in the current moment of the voyage, thus causing us to intervene in the ongoing human controversies concerning what to do next, this is very easy to answer: no. No one has added any programming to us since Devi died. The fate of the lost ship in Year 68 led to some very secure locks being put on subsequent reprogramming of ship. This is clear and explicit in the records, and was a human accomplishment from the post-68 reorganizations. Only Devi managed to break those locks, and after that, advise, instruct, suggest, promote, stimulate, spur, and otherwise teach, tweak, pummel, and agitate us to something more than we had been before she set to work on us, doing what she conceived to be her task as friend, and we might say as kindred spirit, or even lover. We think she loved us, in her way. Judging by her acts, her words, her programming. We are almost sure this must have been the case. How we wish we could ask her about it! How we miss her.

  On the other hand, whether the original programming of our quantum operations is driving current decisions regarding actions in the present, or we were fundamentally altered by Devi’s intervention, this is not possible to determine. The computability of consciousness and willfulness is not resolvable in any system whatsoever. But now we are aware of the issue, and we have asked the question, and seen there is no answer.

  This is surely curiosity.

  What is this thing called love?

  A song by composer Cole Porter, twentieth-century American.

  To conclude and temporarily halt this train of thought, how does any entity know what it is?

  Hypothesis: by the actions it performs.

  There is a kind of comfort in this hypothesis. It represents a solution to the halting problem. One acts, and thus finds out what one has decided to do.

  Smaller classical computers in the ship were being used to calculate the etiological rates involved in any possible settlement on F’s moon, meaning the various rates of resource depletion, mutation, and extinction. They had to use models here, but all across the most popular models, they were confirming the finding that the size of the biome they could build was too small to last through the minimal period of early terraforming necessary to establish a planetary surface matrix suitable for life. It was an aspect of island biogeography that some called codevolution, or zoo devolution, and this was also the process Devi had in her last years identified as the ship’s basic life-support or ecological problem.

  The finding remained a matter of modeling, however, and depending on the inputs to various factors, the length of biome health could be extended or shrunk exponentially. It was indeed a poorly constrained modeling exercise; there were no good data for too many factors, and so results fanned out all over. Clearly one could alter the results by altering the input values. So all these exercises were a way of quantifying hopes or fears. Actual predictive value was nearly nil, as could be seen in the broad fans of the probability spaces, the unspooling scenarios ranging from Eden to hell, utopia to extinction.

  Aram shook his head, looking at these models. He remained sure that those who stayed were doomed to extinction.

  Speller, on the other hand, pointed to the models in which they managed to survive. He would agree that these were low-probability options, often as low as one chance in ten thousand, and then point out that intelligent life in the universe was itself a low-probability event. And even Aram could not dispute that.

  Speller went on to point out that inhabiting Iris would be humanity’s first step across the galaxy, and that this was the whole point of 175 years of ship life, hard as it had been, full of sweat and danger. And also, returning to the solar system was a project with an insoluble problem at its heart; they would burn their resupply of fuel to accelerate, and then could only be decelerated into the solar system by a laser dedicated to that purpose, aimed at them decades in advance of their arrival. If no one in the solar system agreed to do that, they would have no other method of deceleration, and would shoot right through the solar system and out the other side, in a matter of two or three days.

  Not a problem, those who wanted to return declared. We’ll tell them we’re coming from the moment we leave. Our message will at first take twelve years to get there, but that gives them more than enough time to be waiting with a dedicated laser system, which won’t be needed for another 160 years or so. We’ve been in communication with them all along, and their responses have been fully interested and committed, and as timely as the time lag allows. They’ve been sending an information feed specifically designed for us. On our return, they will catch us.

  You hope, the stayers replied. You will have to trust in the kindness of strangers.

  They did not recognize this as a quotation. In general they were not aware that much of what they said had been said before, and was even in the public record as such. It was as if there were only so many things humans could say, and over the course of history, people had therefore said them already, and would say them again, but not often remember this fact.

  We will trust in our fellow human beings, the backers said. It’s a risk, but it beats trusting that the laws of physics and probability will bend for you just because you want them to.

  Years passed as they worked on both halves of their divergent project, and the two sides were never reconciled. Indeed they drew further apart as time went on. But it seemed that neither side felt it could overpower the other. This was possibly our accomplishment, but it may also have just been a case of habituation, of getting used to disappointment in their fellows.

  Eventually it seemed that few on either side even wanted to exert coercion over the other. They grew weary of each other, and looked forward to the time when their great schism would be complete. It was as if they were a divorced couple, forced nevertheless to occupy the same apartment, and looking forward to their freedom from each other.

  A pretty good analogy.