'I—I asked it questions, but it wouldn't say anything, and I had to give the thing a fair shake, so I kind of—yelled at it, and—'

  'And?'

  There was a long pause. Under Susan Calvin's unwavering stare, Randow finally said, 'I tried to scare it into saying something.' He added defensively, 'I had to give the thing a fair shake.'

  'How did you try to scare it?'

  'I pretended to take a punch at it.'

  'And it brushed your arm aside?'

  'It hit my arm.'

  'Very well. That's all.' To Lanning and Bogert, she said, 'Come, gentlemen.'

  At the doorway, she turned back to Randow. 'I can settle the bets going around, if you are still interested. Lenny can speak a few words quite well.'

  They said nothing until they were in Susan Calvin's office. Its walls were lined with her books, some of which she had written herself. It retained the patina of her own frigid, carefully-ordered personality. It had only one chair in it and she sat down. Lanning and Bogert remained standing.

  She said, 'Lenny only defended itself. That is the Third Law: A robot must protect its own existence.'

  'Except,' said Lanning forcefully, 'when this conflicts with the First or Second Laws. Complete the statement! Lenny had no right to defend itself in any way at the cost of harm, however minor, to a human being.'

  'Nor did it,' shot back Calvin, 'knowingly. Lenny had an aborted brain. It had no way of knowing its own strength or the weakness of humans. In brushing aside the threatening arm of a human being it could not know the bone would break. In human terms, no moral blame can be attached to an individual who honestly cannot differentiate good and evil.'

  Bogert interrupted, soothingly, 'Now, Susan, we don't blame. We understand that Lenny is the equivalent of a baby, humanly speaking, and we don't blame it. But the public will. U.S. Robots will be closed down.'

  'Quite the opposite. If you had the brains of a flea, Peter, you would see that this is the opportunity U.S. Robots is waiting for. That this will solve its problems.'

  Lanning hunched his white eyebrows low. He said, softly, 'What problems, Susan?'

  'Isn't the Corporation concerned about maintaining our research personnel at the present—Heaven help us—high level?'

  'We certainly are.'

  'Well, what are you offering prospective researchers? Excitement? Novelty? The thrill of piercing the unknown? No! You offer them salaries and the assurance of no problems.'

  Bogert said, 'How do you mean, no problems?'

  'Are there problems?' shot back Susan Calvin. 'What kind of robots do we turn out? Fully developed robots, fit for their tasks. An industry tells us what it needs; a computer designs the brain; machinery forms the robot; and there it is, complete and done. Peter, some time ago, you asked me with reference to Lenny what its use was. What's the use, you said, of a robot that was not designed for any job? Now I ask you—what's the use of a robot designed for only one job? It begins and ends in the same place. The LNE models mine boron. If beryllium is needed, they are useless. If boron technology enters a new phase, they become useless. A human being so designed would be sub-human. A robot so designed is sub-robotic.'

  'Do you want a versatile robot?' asked Lanning, incredulously.

  'Why not?' demanded the robopsychologist. 'Why not? I've been handed a robot with a brain almost completely stultified. I've been teaching it, and you, Alfred, asked me what was the use of that. Perhaps very little as far as Lenny itself is concerned, since it will never progress beyond the five-year-old level on a human scale. But what's the use in general? A very great deal, if you consider it as a study in the abstract problem of learning how to teach robots. I have learned ways to short-circuit neighboring pathways in order to create new ones. More study will yield better, more subtle and more efficient techniques of doing so.'

  'Well?'

  'Suppose you started with a positronic brain that had all the basic pathways carefully outlined but none of the secondaries. Suppose you then started creating secondaries. You could sell basic robots designed for instruction; robots that could be modelled to a job, and then modelled to another, if necessary. Robots would become as versatile as human beings. Robots could learn!'

  They stared at her.

  She said, impatiently, 'You still don't understand, do you?'

  'I understand what you are saying,' said Lanning.

  'Don't you understand that with a completely new field of research and completely new techniques to be developed, with a completely new area of the unknown to be penetrated, youngsters will feel a new urge to enter robotics? Try it and see.'

  'May I point out,' said Bogert, smoothly, 'that this is dangerous. Beginning with ignorant robots such as Lenny will mean that one could never trust First Law—exactly as turned out in Lenny's case.'

  'Exactly. Advertise the fact.'

  'Advertise it!'

  'Of course. Broadcast the danger. Explain that you will set up a new research institute on the moon, if Earth's population chooses not to allow this sort of thing to go on upon Earth, but stress the danger to the possible applicants by all means.'

  Lanning said, 'For God's sake, why?'

  'Because the spice of danger will add to the lure. Do you think nuclear technology involves no danger and spationautics no peril? Has your lure of absolute security been doing the trick for you? Has it helped you to cater to the Frankenstein complex you all despise so? Try something else then, something that has worked in other fields.'

  There was a sound from beyond the door that led to Calvin's personal laboratories. It was the chiming sound of Lenny.

  The robopsychologist broke off instantly, listening. She said, 'Excuse me. I think Lenny is calling me.'

  'Can it call you?' said Lanning.

  'I said I've managed to teach it a few words.' She stepped toward the door, a little flustered. 'If you will wait for me—'

  They watched her leave and were silent for a moment. Then Lanning said, 'Do you think there's anything to what she says, Peter?'

  'Just possibly, Alfred,' said Bogert. 'Just possibly. Enough for us to bring the matter up at the directors' meeting and see what they say. After all, the fat is in the fire. A robot has harmed a human being and knowledge of it is public. As Susan says, we might as well try to turn the matter to our advantage. Of course, I distrust her motives in all this.'

  'How do you mean?'

  'Even if all she has said is perfectly true, it is only rationalization as far as she is concerned. Her motive in all this is her desire to hold on to this robot. If we pressed her,' (and the mathematician smiled at the incongruous literal meaning of the phrase) 'she would say it was to continue learning techniques of teaching robots, but I think she has found another use for Lenny. A rather unique one that would fit only Susan of all women.'

  'I don't get your drift.'

  Bogert said, 'Did you hear what the robot was calling?'

  'Well, no, I didn't quite—' began Lanning, when the door opened suddenly, and both men stopped talking at once.

  Susan Calvin stepped in again, looking about uncertainly. 'Have either of you seen—I'm positive I had it somewhere about—Oh, there it is.'

  She ran to a corner of one bookcase and picked up an object of intricate metal webbery, dumbbell shaped and hollow, with variously-shaped metal pieces inside each hollow, just too large to be able to fall out of the webbing.

  As she picked it up, the metal pieces within moved and struck together, clicking pleasantly. It struck Lanning that the object was a kind of robotic version of a baby rattle.

  As Susan Calvin opened the door again to pass through, Lenny's voice chimed again from within. This time, Lanning heard it clearly as it spoke the words Susan Calvin had taught it.

  In heavenly celeste-like sounds, it called out, 'Mommie, I want you. I want you, Mommie.'

  And the footsteps of Susan Calvin could be heard hurrying eagerly across the laboratory floor toward the only kind of baby she could ever have or love.

  The longest story involving Susan Calvin appeared in the December 1957 issue of Galaxy. It came within a hair of not being written at all.

  Horace Gold, then editor of Galaxy, called me long-distance to ask me to write a story for him—always a terribly flattering situation and with me flattery will get you everywhere.

  However, I had to explain regretfully that I was absolutely incapable of writing a story at the moment. I was deep in the galley proof of the third edition of a biochemistry textbook I was co-authoring.

  'Can't you have someone else read the galley proof?' he asked.

  'Of course not,' I responded with virtuous indignation. 'I couldn't trust these galleys to anyone else.'

  And having hung up, I walked upstairs to my beloved attic, galley proof in hand, and between the bottom step and the top step a thought occurred to me. I put the galleys to one side and got started at once. I continued at top speed until, a few days later, 'Galley Slave' was done.

  Of all my Susan Calvin stories, this is my favorite. I don't know that I can give a good reason for it; but then, I suppose an author may have his irrational likes and dislikes as well as the next man.

  GALLEY SLAVE

  The United States Robots and Mechanical Men, Inc., as defendants in the case, had influence enough to force a closed-doors trial without a jury.

  Nor did Northeastern University try hard to prevent it. The trustees knew perfectly well how the public might react to any issue involving misbehavior of a robot, however rarefied that misbehavior might be. They also had a clearly visualized notion of how an antirobot riot might become an antiscience riot without warning.

  The government, as represented in this case by Justice Harlow Shane, was equally anxious for a quiet end to this mess. Both U.S. Robots and the academic world were bad people to antagonize.

  Justice Shane said, 'Since neither press, public, nor jury is present, gentlemen, let us stand on as little ceremony as we can and get to the facts.'

  He smiled stiffly as he said this, perhaps without much hope that his request would be effective, and hitched at his robe so that he might sit more comfortably. His face was pleasantly rubicund, his chin round and soft, his nose broad and his eyes light in color and wide-set. All in all, it was not a face with much judicial majesty and the judge knew it.

  Barnabas H. Goodfellow, Professor of Physics at Northeastern U., was sworn in first, taking the usual vow with an expression that made mincemeat of his name.

  After the usual opening-gambit questions, Prosecution shoved his hands deep into his pockets and said, 'When was it, Professor, that the matter of the possible employ of Robot EZ-27 was first brought to your attention, and how?'

  Professor Goodfellow's small and angular face set itself into an uneasy expression, scarcely more benevolent than the one it replaced. He said, 'I have had professional contact and some social acquaintance with Dr. Alfred Lanning, Director of Research at U.S. Robots. I was inclined to listen with some tolerance then when I received a rather strange suggestion from him on the third of March of last year—'

  'Of 2033?'

  'That's right.'

  'Excuse me for interrupting. Please proceed.'

  The professor nodded frostily, scowled to fix the facts in his mind, and began to speak.

  Professor Goodfellow looked at the robot with a certain uneasiness. It had been carried into the basement supply room in a crate, in accordance with the regulations governing the shipment of robots from place to place on the Earth's surface.

  He knew it was coming; it wasn't that he was unprepared. From the moment of Dr. Lanning's first phone call on March 3, he had felt himself giving way to the other's persuasiveness, and now, as an inevitable result, he found himself face to face with a robot.

  It looked uncommonly large as it stood within arm's reach.

  Alfred Lanning cast a hard glance of his own at the robot, as though making certain it had not been damaged in transit. Then he turned his ferocious eyebrows and his mane of white hair in the professor's direction.

  'This is Robot EZ-27, first of its model to be available for public use.' He turned to the robot. 'This is Professor Goodfellow, Easy.'

  Easy spoke impassively, but with such suddenness that the professor shied. 'Good afternoon, Professor.'

  Easy stood seven feet tall and had the general proportions of a man—always the prime selling point of U.S. Robots. That and the possession of the basic patents on the positronic brain had given them an actual monopoly on robots and a near-monopoly on computing machines in general.

  The two men who had uncrated the robot had left now and the professor looked from Lanning to the robot and back to Lanning. 'It is harmless, I'm sure.' He didn't sound sure.

  'More harmless than I am,' said Lanning. 'I could be goaded into striking you. Easy could not be. You know the Three Laws of Robotics, I presume.'

  'Yes, of course,' said Goodfellow.

  'They are built into the positronic patterns of the brain and must be observed. The First Law, the prime rule of robotic existence, safeguards the life and well-being of all humans.' He paused, rubbed at his cheek, then added, 'It's something of which we would like to persuade all Earth if we could.'

  'It's just that he seems formidable.'

  'Granted. But whatever he seems, you'll find that he is useful.'

  'I'm not sure in what way. Our conversations were not very helpful in that respect. Still, I agreed to look at the object and I'm doing it.'

  'We'll do more than look, Professor. Have you brought a book?'

  'I have.'

  'May I see it?'

  Professor Goodfellow reached down without actually taking his eyes off the metal-in-human-shape that confronted him. From the briefcase at his feet, he withdrew a book.

  Lanning held out his hand for it and looked at the backstrip. 'Physical Chemistry of Electrolytes in Solution. Fair enough, sir. You selected this yourself, at random. It was no suggestion of mine, this particular text. Am I right?'

  'Yes.'

  Lanning passed the book to Robot EZ-27.

  The professor jumped a little. 'No! That's a valuable book!'

  Lanning raised his eyebrows and they looked like shaggy coconut icing. He said, 'Easy has no intention of tearing the book in two as a feat of strength, I assure you. It can handle a book as carefully as you or I. Go ahead, Easy.'

  'Thank you, sir,' said Easy. Then, turning its metal bulk slightly, it added, 'With your permission, Professor Goodfellow.'

  The professor stared, then said, 'Yes—yes, of course.'

  With a slow and steady manipulation of metal fingers, Easy turned the pages of the book, glancing at the left page, then the right; turning the page, glancing left, then right; turning the page and so on for minute after minute.

  The sense of its power seemed to dwarf even the large cement-walled room in which they stood and to reduce the two human watchers to something considerably less than life-size.

  Goodfellow muttered, 'The light isn't very good.'

  'It will do.'

  Then, rather more sharply, 'But what is he doing?'

  'Patience, sir.'

  The last page was turned eventually. Lanning asked, 'Well, Easy?'

  The robot said, 'It is a most accurate book and there is little to which I can point. On line 22 of page 27, the word "positive" is spelled p-o-i-s-t-i-v-e. The comma in line 6 of page 32 is superfluous, whereas one should have been used on line 13 of page 54. The plus sign in equation XIV-2 on page 337 should be a minus sign if it is to be consistent with the previous equations—'

  'Wait! Wait!' cried the professor. 'What is he doing?'

  'Doing?' echoed Lanning in sudden irascibility. 'Why, man, he has already done it! He has proofread that book.'

  'Proofread it?'

  'Yes. In the short time it took him to turn those pages, he caught every mistake in spelling, grammar, and punctuation. He has noted errors in word order and detected inconsistencies. And he will retain the information, letter-perfect, indefinitely.'

  The professor's mouth was open. He walked rapidly away from Lanning and Easy and as rapidly back. He folded his arms across his chest and stared at them. Finally he said, 'You mean this is a proofreading robot?'

  Lanning nodded. 'Among other things.'

  'But why do you show it to me?'

  'So that you might help me persuade the university to obtain it for use.'

  'To read proof?'

  'Among other things,' Lanning repeated patiently.

  The professor drew his pinched face together in a kind of sour disbelief. 'But this is ridiculous!'

  'Why?'

  'The university could never afford to buy this half-ton—it must weigh that at least—this half-ton proofreader.'

  'Proofreading is not all it will do. It will prepare reports from outlines, fill out forms, serve as an accurate memory-file, grade papers———'

  'All picayune!'

  Lanning said, 'Not at all, as I can show you in a moment. But I think we can discuss this more comfortably in your office, if you have no objection.'

  'No, of course not,' began the professor mechanically and took a half-step as though to turn. Then he snapped out, 'But the robot—we can't take the robot. Really, Doctor, you'll have to crate it up again.'

  'Time enough. We can leave Easy here.'

  'Unattended?'

  'Why not? He knows he is to stay. Professor Goodfellow, it is necessary to understand that a robot is far more reliable than a human being.'

  'I would be responsible for any damage———'

  'There will be no damage. I guarantee that. Look, it's after hours. You expect no one here, I imagine, before tomorrow morning. The truck and my two men are outside. U.S. Robots will take any responsibility that may arise. None will. Call it a demonstration of the reliability of the robot.'

  The professor allowed himself to be led out of the storeroom. Nor did he look entirely comfortable in his own office, five stories up.

  He dabbed at the line of droplets along the upper half of his forehead with a white handkerchief.

  'As you know very well, Dr. Lanning, there are laws against the use of robots on Earth's surface,' he pointed out.