  They were moving toward the house up ahead, presumably the house of Jander's quasi-owner.

  Baley could hear the rustle of some animal in the grass to the right, the sudden chirrup of a bird somewhere in a tree behind him, the small unplaceable clatter of insects all about. These, he told himself, were all animals with ancestors that had once lived on Earth. They had no way of knowing that this patch of ground they inhabited was not all there was—forever and forever back in time. The very trees and grass had arisen from other trees and grass that had once grown on Earth.

  Only human beings could live on this world and know that they were not autochthonous but had stemmed from Earthmen—and yet did the Spacers really know it or did they simply put it out of their mind? Would the time come, perhaps, when they would not know it at all? When they would not remember which world they had come from or whether there was a world of origin at all?

  “Dr. Fastolfe,” he said suddenly, in part to break the chain of thought that he found to be growing oppressive, “you still have not told me your motive for the destruction of Jander.”

  “True! I have not! —Now why do you suppose, Mr. Baley, I have labored to work out the theoretical basis for the positronic brains of humaniform robots?”

  “I cannot say.”

  “Well, think. The task is to design a robotic brain as close to the human as possible and that would require, it would seem, a certain reach into the poetic—” He paused and his small smile became an outright grin. “You know it always bothers some of my colleagues when I tell them that, if a conclusion is not poetically balanced, it cannot be scientifically true. They tell me they don't know what that means.”

  Baley said, “I'm afraid I don't, either.”

  “But I know what it means. I can't explain it, but I feel the explanation without being able to put it into words, which may be why I have achieved results my colleagues have not. However, I grow grandiose, which is a good sign I should become prosaic. To imitate a human brain, when I know almost nothing about the workings of the human brain, needs an intuitive leap—something that feels to me like poetry. And the same intuitive leap that would give me the humaniform positronic brain should surely give me a new access of knowledge about the human brain itself. That was my belief—that through humaniformity I might take at least a small step toward the psychohistory I told you about.”

  “I see.”

  “And if I succeeded in working out a theoretical structure that would imply a humaniform positronic brain, I would need a humaniform body to place it in. The brain does not exist by itself, you understand. It interacts with the body, so that a humaniform brain in a nonhumaniform body would become, to an extent, itself nonhuman.”

  “Are you sure of that?”

  “Quite. You have only to compare Daneel with Giskard.”

  “Then Daneel was constructed as an experimental device for furthering the understanding of the human brain?”

  “You have it. I labored two decades at the task with Sarton. There were numerous failures that had to be discarded. Daneel was the first true success and, of course, I kept him for further study—and out of—” he grinned lop-sidedly, as though admitting to something silly—“affection. After all, Daneel can grasp the notion of human duty, while Giskard, with all his virtues, has trouble doing so. You saw.”

  “And Daneel's stay on Earth with me, three years ago, was his first assigned task?”

  “His first of any importance, yes. When Sarton was murdered, we needed something that was a robot and could withstand the infectious diseases of Earth and yet looked enough like a man to get around the antirobotic prejudices of Earth's people.”

  “An astonishing coincidence that Daneel should be right at hand at that time.”

  “Oh? Do you believe in coincidences? It is my feeling that any time at which a development as revolutionary as the humaniform robot came into being, some task that would require its use would present itself. Similar tasks had probably been presenting themselves regularly in all the years that Daneel did not exist—and because Daneel did not exist, other solutions and devices had to be used.”

  “And have your labors been successful, Dr. Fastolfe? Do you now understand the human brain better than you did?”

  Fastolfe had been moving more and more slowly and Baley had been matching his progress to the other's. They were now standing still, about halfway between Fastolfe's establishment and the other's. It was the most difficult point for Baley, since it was equally distant from protection in either direction, but he fought down the growing uneasiness, determined not to provoke Giskard. He did not wish by some motion or outcry—or even expression—to activate the inconvenience of Giskard's desire to save him. He did not want to have himself lifted up and carried off to shelter.

  Fastolfe showed no sign of understanding Baley's difficulty. He said, “There's no question but that advances in mentology have been carried through. There remain enormous problems and perhaps these will always remain, but there has been progress. Still—”

  “Still?”

  “Still, Aurora is not satisfied with a purely theoretical study of the human brain. Uses for humaniform robots have been advanced that I do not approve of.”

  “Such as the use on Earth.”

  “No, that was a brief experiment that I rather approved of and was even fascinated by. Could Daneel fool Earthpeople? It turned out he could, though, of course, the eyes of Earthmen for robots are not very keen. Daneel cannot fool the eyes of Aurorans, though I dare say future humaniform robots could be improved to the point where they would. There are other tasks that have been proposed, however.”

  “Such as?”

  Fastolfe gazed thoughtfully into the distance. “I told you this world was tame. When I began my movement to encourage a renewed period of exploration and settlement, it was not to the supercomfortable Aurorans—or Spacers generally—that I looked for leadership. I rather thought we ought to encourage Earthmen to take the lead. With their horrid world—excuse me—and short life-span, they have so little to lose, I thought that they would surely welcome the chance, especially if we were to help them technologically. I spoke to you about such a thing when I saw you on Earth three years ago. Do you remember?” He looked sidelong at Baley.

  Baley said stolidly, “I remember quite well. In fact, you started a chain of thought in me that has resulted in a small movement on Earth in that very direction.”

  “Indeed? It would not be easy, I imagine. There is the claustrophilia of you Earthmen, your dislike of leaving your walls.”

  “We are fighting it, Dr. Fastolfe. Our organization is planning to move out into space. My son is a leader in the movement and I hope the day may come when he leaves Earth at the head of an expedition to settle a new world. If we do indeed receive the technological help you speak of—” Baley let that dangle.

  “If we supplied the ships, you mean?”

  “And other equipment. Yes, Dr. Fastolfe.”

  “There are difficulties. Many Aurorans do not want Earthmen to move outward and settle new worlds. They fear the rapid spread of Earthish culture, its beehive Cities, its chaoticism.” He stirred uneasily and said, “Why are we standing here, I wonder? Let's move on.”

  He walked slowly forward and said, “I have argued that that would not be the way it would be. I have pointed out that the settlers from Earth would not be Earthmen in the classical mode. They would not be enclosed in Cities. Coming to a new world, they would be like the Auroran Fathers coming here. They would develop a manageable ecological balance and would be closer to Aurorans than to Earthmen in attitude.”

  “Would they not then develop all the weaknesses you find in Spacer culture, Dr. Fastolfe?”

  “Perhaps not. They would learn from our mistakes. —But that is academic, for something has developed which makes the argument moot.”

  “And what is that?”

  “Why, the humaniform robot. You see, there are those who see the humaniform robot as the perfect settler. It is they who can build the new worlds.”

  Baley said, “You've always had robots. Do you mean this idea was never advanced before?”

  “Oh, it was, but it was always clearly unworkable. Ordinary nonhumaniform robots, without immediate human supervision, building a world that would suit their own nonhumaniform selves, could not be expected to tame and build a world that would be suitable for the more delicate and flexible minds and bodies of human beings.”

  “Surely the world they would build would serve as a reasonable first approximation.”

  “Surely it would, Mr. Baley. It is a sign of Auroran decay, however, that there is an overwhelming feeling among our people that a reasonable first approximation is unreasonably insufficient. —A group of humaniform robots, on the other hand, as closely resembling human beings in body and mind as possible, would succeed in building a world which, in suiting themselves, would also inevitably suit Aurorans. Do you follow the reasoning?”

  “Completely.”

  “They would build a world so well, you see, that when they are done and Aurorans are finally willing to leave, our human beings will step out of Aurora and into another Aurora. They will never have left home; they will simply have another newer home, exactly like the other one, in which to continue their decay. Do you follow that reasoning, too?”

  “I see your point, but I take it that Aurorans do not.”

  “May not. I think I can argue the point effectively, if the opposition does not destroy me politically via this matter of the destruction of Jander. Do you see the motive attributed to me? I am supposed to have embarked on a program of the destruction of humaniform robots rather than allow them to be used to settle other planets. Or so my enemies say.”

  It was Baley now who stopped walking. He looked thoughtfully at Fastolfe and said, “You understand, Dr. Fastolfe, that it is to Earth's interest that your point of view win out completely.”

  “And to your own interests as well, Mr. Baley.”

  “And to mine. But if I put myself to one side for the moment, it still remains vital to my world that our people be allowed, encouraged, and helped to explore the Galaxy; that we retain as much of our own ways as we are comfortable with; that we not be condemned to imprisonment on Earth forever, since there we can only perish.”

  Fastolfe said, “Some of you, I think, will insist on remaining imprisoned.”

  “Of course. Perhaps almost all of us will. However, at least some of us—as many of us as possible—will escape if given permission. —It is therefore my duty, not only as a representative of the law of a large fraction of humanity but as an Earthman, plain and simple, to help you clear your name, whether you are guilty or innocent. Nevertheless, I can throw myself wholeheartedly into this task only if I know that, in fact, the accusations against you are unjustified.”

  “Of course! I understand.”

  “In the light, then, of what you have told me of the motive attributed to you, assure me once again that you did not do this thing.”

  Fastolfe said, “Mr. Baley, I understand completely that you have no choice in this matter. I am quite aware that I can tell you, with impunity, that I am guilty and that you would still be compelled by the nature of your needs and those of your world to work with me to mask that fact. Indeed, if I were actually guilty, I would feel compelled to tell you so, so that you could take that fact into consideration and, knowing the truth, work the more efficiently to rescue me—and yourself. But I cannot do so, because the fact is I am innocent. However much appearances may be against me, I did not destroy Jander. Such a thing never entered my mind.”

  “Never?”

  Fastolfe smiled sadly. “Oh, I may have thought once or twice that Aurora would have been better off if I had never worked out the ingenious notions that led to the development of the humaniform positronic brain—or that it would be better off if such brains proved unstable and readily subject to mental freeze-out. But those were fugitive thoughts. Not for a split second did I contemplate bringing about Jander's destruction for this reason.”

  “Then we must destroy this motive that they attribute to you.”

  “Good. But how?”

  “We could show that it serves no purpose. What good does it do to destroy Jander? More humaniform robots can be built. Thousands. Millions.”

  “I'm afraid that's not so, Mr. Baley. None can be built. I alone know how to design them, and, as long as robot colonization is a possible destiny, I refuse to build any more. Jander is gone and only Daneel is left.”

  “The secret will be discovered by others.”

  Fastolfe's chin went up. “I would like to see the roboticist capable of it. My enemies have established a Robotics Institute with no other purpose than to work out the methods behind the construction of a humaniform robot, but they won't succeed. They certainly haven't succeeded so far and I know they won't succeed.”

  Baley frowned. “If you are the only man who knows the secret of the humaniform robots, and if your enemies are desperate for it, will they not try to get it out of you?”

  “Of course. By threatening my political existence, by perhaps maneuvering some punishment that will forbid my working in the field and thus putting an end to my professional existence as well, they hope to have me agree to share the secret with them. They may even have the Legislature direct me to share the secret on the pain of confiscation of property, imprisonment—who knows what? However, I have made up my mind to submit to anything—anything—rather than give in. But I don't want to have to, you understand.”

  “Do they know of your determination to resist?”

  “I hope so. I have told them plainly enough. I presume they think I'm bluffing, that I'm not serious. —But I am.”

  “But if they believe you, they might take more serious steps.”

  “What do you mean?”

  “Steal your papers. Kidnap you. Torture you.”

  Fastolfe broke into a loud laugh and Baley flushed. He said, “I hate to sound like a hyperwave drama, but have you considered that?”

  Fastolfe said, “Mr. Baley—First, my robots can protect me. It would take full-scale war to capture me or my work. Second, even if somehow they succeeded, not one of the roboticists opposed to me could bear to make it plain that the only way he could obtain the secret of the humaniform positronic brain is to steal it or force it from me. His or her professional reputation would be completely wiped out. Third, such things on Aurora are unheard of. The merest hint of an unprofessional attempt upon me would swing the Legislature—and public opinion—in my favor at once.”

  “Is that so?” muttered Baley, silently damning the fact of having to work in a culture whose details he simply didn't understand.

  “Yes. Take my word for it. I wish they would try something of this melodramatic sort. I wish they were so incredibly stupid as to do so. In fact, Mr. Baley, I wish I could persuade you to go to them, worm your way into their confidence, and cajole them into mounting an attack on my establishment or waylaying me on an empty road—or anything of the sort that, I imagine, is common on Earth.”

  Baley said stiffly, “I don't think that would be my style.”

  “I don't think so, either, so I have no intention of trying to implement my wish. And believe me, that is too bad, for if we cannot persuade them to try the suicidal method of force, they will continue to do something much better, from their standpoint. They will destroy me by falsehoods.”

  “What falsehoods?”

  “It is not just the destruction of one robot they attribute to me. That is bad enough and just might suffice. They are whispering—it is only a whisper as yet—that the death is merely an experiment of mine and a dangerous, successful one. They whisper that I am working out a system for destroying humaniform brains rapidly and efficiently, so that when my enemies do create their own humaniform robots, I, together with members of my party, will be able to destroy them all, thus preventing Aurora from settling new worlds and leaving the Galaxy to my Earthmen confederates.”

  “Surely there can be no truth in this.”

  “Of course not. I told you these are lies. And ridiculous lies, too. No such method of destruction is even theoretically possible and the Robotics Institute people are not on the point of creating their own humaniform robots. I cannot conceivably indulge in an orgy of mass destruction even if I wanted to. I cannot.”

  “Doesn't the whole thing fall by its own weight, then?”

  “Unfortunately, it's not likely to do so in time. It may be silly nonsense, but it will probably last long enough to sway public opinion against me to the point of swinging just enough votes in the Legislature to defeat me. Eventually, it will all be recognized as nonsense, but by then it will be too late. And please notice that Earth is being used as a whipping boy in this. The charge that I am laboring on behalf of Earth is a powerful one and many will choose to believe the whole farrago, against their own better sense, because of their dislike of Earth and Earth-people.”

  Baley said, “What you're telling me is that active resentment against Earth is being built up.”

  Fastolfe said, “Exactly, Mr. Baley. The situation grows worse for me—and for Earth—every day and we have very little time.”

  “But isn't there an easy way of knocking this thing on its head?” (Baley, in despair, decided it was time to fall back on Daneel's point.) “If you were indeed anxious to test 2Lmethod for the destruction of a humaniform robot, why seek out one in another establishment, one with which it might be inconvenient to experiment? You had Daneel, himself, in your own establishment. He was at hand and convenient. Would not the experiment have been conducted upon him if there were any truth at all in the rumor?”

  “No, no,” said Fastolfe. “I couldn't get anyone to believe that. Daneel was my first success, my triumph. I wouldn't destroy him under any circumstances. Naturally, I would turn to Jander. Everyone would see that and I would be a fool to try to persuade them that it would have made more sense for me to sacrifice Daneel.”