The United States Robots and Mechanical Men Corporation, as defendants in the case, had influence enough to force a closed-doors trial without a jury.
Nor did Northeastern University try hard to prevent it. The trustees knew perfectly well how the public might react to any issue involving misbehavior of a robot, however rarefied that misbehavior might be. They also had a clearly visualized notion of how an antirobot riot might become an antiscience riot without warning.
The government, as represented in this case by Justice Harlow Shane, was equally anxious for a quiet end to this mess. Both U. S. Robots and the academic world were bad people to antagonize.
Justice Shane said, "Since neither press, public nor jury is present, gentlemen, let us stand on as little ceremony as we can and get to the facts."
He smiled stiffly as he said this, perhaps without much hope that his request would be effective, and hitched at his robe so that he might sit more comfortably. His face was pleasantly rubicund, his chin round and soft, his nose broad and his eyes light in color and wide-set. All in all, it was not a face with much judicial majesty and the judge knew it.
Barnabas H. Goodfellow, Professor of Physics at Northeastern U., was sworn in first, taking the usual vow with an expression that made mincemeat of his name.
After the usual opening-gambit questions, Prosecution shoved his hands deep into his pockets and said, "When was it, Professor, that the matter of the possible employ of Robot EZ-27 was first brought to your attention, and how?"
Professor Goodfellow's small and angular face set itself into an uneasy expression, scarcely more benevolent than the one it replaced. He said, "I have had professional contact and some social acquaintance with Dr. Alfred Lanning, Director of Research at U. S. Robots. I was inclined to listen with some tolerance then when I received a rather strange suggestion from him on the third of March of last year-"
"Of 2033?"
"That's right."
"Excuse me for interrupting. Please proceed."
The professor nodded frostily, scowled to fix the facts in his mind, and began to speak.
Professor Goodfellow looked at the robot with a certain uneasiness. It had been carried into the basement supply room in a crate, in accordance with the regulations governing the shipment of robots from place to place on the Earth's surface.
He knew it was coming; it wasn't that he was unprepared. From the moment of Dr. Lanning's first phone call on March 3, he had felt himself giving way to the other's persuasiveness, and now, as an inevitable result, he found himself face to face with a robot.
It looked uncommonly large as it stood within arm's reach. Alfred Lanning cast a hard glance of his own at the robot, as though making certain it had not been damaged in transit. Then he turned his ferocious eyebrows and his mane of white hair in the professor's direction.
"This is Robot EZ-27, first of its model to be available for public use." He turned to the robot. "This is Professor Goodfellow, Easy."
Easy spoke impassively, but with such suddenness that the professor shied. "Good afternoon, Professor."
Easy stood seven feet tall and had the general proportions of a man-always the prime selling point of U. S. Robots. That and the possession of the basic patents on the positronic brain had given them an actual monopoly on robots and a near-monopoly on computing machines in general.
The two men who had uncrated the robot had left now and the professor looked from Lanning to the robot and back to Lanning. "It is harmless, I'm sure." He didn't sound sure.
"More harmless than I am," said Lanning. "I could be goaded into striking you. Easy could not be. You know the Three Laws of Robotics, I presume."
"Yes, of course," said Goodfellow.
"They are built into the positronic patterns of the brain and must be observed. The First Law, the prime rule of robotic existence, safeguards the life and well-being of all humans." He paused, rubbed at his cheek, then added, "It's something of which we would like to persuade all Earth if we could."
"It's just that he seems formidable."
"Granted. But whatever he seems, you'll find that he is useful."
"I'm not sure in what way. Our conversations were not very helpful in that respect. Still, I agreed to look at the object and I'm doing it."
"We'll do more than look, Professor. Have you brought a book?"
"I have."
"May I see it?"
Professor Goodfellow reached down without actually taking his eyes off the metal-in-human-shape that confronted him. From the briefcase at his feet, he withdrew a book.
Lanning held out his hand for it and looked at the backstrip. "Physical Chemistry of Electrolytes in Solution. Fair enough, sir. You selected this yourself, at random. It was no suggestion of mine, this particular text. Am I right?"
"Yes."
Lanning passed the book to Robot EZ-27.
The professor jumped a little. "No! That's a valuable book!"
Lanning raised his eyebrows and they looked like shaggy coconut icing. He said, "Easy has no intention of tearing the book in two as a feat of strength, I assure you. It can handle a book as carefully as you or I. Go ahead, Easy."
"Thank you, sir," said Easy. Then, turning its metal bulk slightly, it added, "With your permission, Professor Goodfellow."
The professor stared, then said, "Yes-yes, of course."
With a slow and steady manipulation of metal fingers, Easy turned the pages of the book, glancing at the left page, then the right; turning the page, glancing left, then right; turning the page and so on for minute after minute.
The sense of its power seemed to dwarf even the large cement-walled room in which they stood and to reduce the two human watchers to something considerably less than life-size.
Goodfellow muttered, "The light isn't very good."
"It will do."
Then, rather more sharply, "But what is he doing?"
"Patience, sir."
The last page was turned eventually. Lanning asked, "Well, Easy?"
The robot said, "It is a most accurate book and there is little to which I can point. On line 22 of page 27, the word 'positive' is spelled p-o-i-s-t-i-v-e. The comma in line 6 of page 32 is superfluous, whereas one should have been used on line 13 of page 54. The plus sign in equation XIV-2 on page 337 should be a minus sign if it is to be consistent with the previous equations-"
"Wait! Wait!" cried the professor. "What is he doing?"
"Doing?" echoed Lanning in sudden irascibility. "Why, man, he has already done it! He has proofread that book."
"Proofread it?"
"Yes. In the short time it took him to turn those pages, he caught every mistake in spelling, grammar and punctuation. He has noted errors in word order and detected inconsistencies. And he will retain the information, letter-perfect, indefinitely."
The professor's mouth was open. He walked rapidly away from Lanning and Easy and as rapidly back. He folded his arms across his chest and stared at them. Finally he said, "You mean this is a proofreading robot?"
Lanning nodded. "Among other things."
"But why do you show it to me?"
"So that you might help me persuade the university to obtain it for use."
"To read proof?"
"Among other things," Lanning repeated patiently.
The professor drew his pinched face together in a kind of sour disbelief. "But this is ridiculous!"
"Why?"
"The university could never afford to buy this half-ton-it must weigh that at least-this half-ton proofreader."
"Proofreading is not all it will do. It will prepare reports from outlines, fill out forms, serve as an accurate memory-file, grade papers-"
"All picayune!"
Lanning said, "Not at all, as I can show you in a moment. But I think we can discuss this more comfortably in your office, if you have no objection."
"No, of course not," began the professor mechanically and took a half-step as though to turn. Then he snapped out, "But the robot-we can't take the robot. Really, Doctor, you'll have to crate it up again."
"Time enough. We can leave Easy here."
"Unattended?"
"Why not? He knows he is to stay. Professor Goodfellow, it is necessary to understand that a robot is far more reliable than a human being."
"I would be responsible for any damage-"
"There will be no damage. I guarantee that. Look, it's after hours. You expect no one here, I imagine, before tomorrow morning. The truck and my two men are outside. U. S. Robots will take any responsibility that may arise. None will. Call it a demonstration of the reliability of the robot."
The professor allowed himself to be led out of the storeroom. Nor did he look entirely comfortable in his own office, five stories up.
He dabbed at the line of droplets along the upper half of his forehead with a white handkerchief.
"As you know very well, Dr. Lanning, there are laws against the use of robots on Earth's surface," he pointed out.
"The laws, Professor Goodfellow, are not simple ones. Robots may not be used on public thoroughfares or within public edifices. They may not be used on private grounds or within private structures except under certain restrictions that usually turn out to be prohibitive. The university, however, is a large and privately owned institution that usually receives preferential treatment. If the robot is used only in a specific room for only academic purposes, if certain other restrictions are observed and if the men and women having occasion to enter the room cooperate fully, we may remain within the law."
"But all that trouble just to read proof?"
"The uses would be infinite, Professor. Robotic labor has so far been used only to relieve physical drudgery. Isn't there such a thing as mental drudgery? When a professor capable of the most useful creative thought is forced to spend two weeks painfully checking the spelling of lines of print and I offer you a machine that can do it in thirty minutes, is that picayune?"
"But the price-"
"The price need not bother you. You cannot buy EZ-27. U. S. Robots does not sell its products. But the university can lease EZ-27 for a thousand dollars a year-considerably less than the cost of a single microwave spectrograph continuous-recording attachment."
Goodfellow looked stunned. Lanning followed up his advantage by saying, "I only ask that you put it up to whatever group makes the decisions here. I would be glad to speak to them if they want more information."
"Well," Goodfellow said doubtfully, "I can bring it up at next week's Senate meeting. I can't promise that will do any good, though."
"Naturally," said Lanning.
The Defense Attorney was short and stubby and carried himself rather portentously, a stance that had the effect of accentuating his double chin. He stared at Professor Goodfellow, once that witness had been handed over, and said, "You agreed rather readily, did you not?"
The Professor said briskly, "I suppose I was anxious to be rid of Dr. Lanning. I would have agreed to anything."
"With the intention of forgetting about it after he left?"
"Well-"
"Nevertheless, you did present the matter to a meeting of the Executive Board of the University Senate."
"Yes, I did."
"So that you agreed in good faith with Dr. Lanning's suggestions. You weren't just going along with a gag. You actually agreed enthusiastically, did you not?"
"I merely followed ordinary procedures."
"As a matter of fact, you weren't as upset about the robot as you now claim you were. You know the Three Laws of Robotics and you knew them at the time of your interview with Dr. Lanning."
"Well, yes."
"And you were perfectly willing to leave a robot at large and unattended."
"Dr. Lanning assured me-"
"Surely you would never have accepted his assurance if you had had the slightest doubt that the robot might be in the least dangerous."
The professor began frigidly, "I had every faith in the word-"
"That is all," said Defense abruptly.
As Professor Goodfellow, more than a bit ruffled, stood down, Justice Shane leaned forward and said, "Since I am not a robotics man myself, I would appreciate knowing precisely what the Three Laws of Robotics are. Would Dr. Lanning quote them for the benefit of the court?"
Dr. Lanning looked startled. He had been virtually bumping heads with the gray-haired woman at his side. He rose to his feet now and the woman looked up, too-expressionlessly.
Dr. Lanning said, "Very well, Your Honor." He paused as though about to launch into an oration and said, with laborious clarity, "First Law: a robot may not injure a human being, or, through inaction, allow a human being to come to harm. Second Law: a robot must obey the orders given it by human beings, except where such orders would conflict with the First Law. Third Law: a robot must protect its own existence as long as such protection does not conflict with the First or Second Laws."
"I see," said the judge, taking rapid notes. "These Laws are built into every robot, are they?"
"Into every one. That will be borne out by any roboticist."
"And into Robot EZ-27 specifically?"
"Yes, Your Honor."
"You will probably be required to repeat those statements under oath."
"I am ready to do so, Your Honor." He sat down again.
Dr. Susan Calvin, robopsychologist-in-chief for U. S. Robots, who was the gray-haired woman sitting next to Lanning, looked at her titular superior without favor, but then she showed favor to no human being. She said, "Was Goodfellow's testimony accurate, Alfred?"
"Essentially," muttered Lanning. "He wasn't as nervous as all that about the robot and he was anxious enough to talk business with me when he heard the price. But there doesn't seem to be any drastic distortion."
Dr. Calvin said thoughtfully, "It might have been wise to put the price higher than a thousand."
"We were anxious to place Easy."
"I know. Too anxious, perhaps. They'll try to make it look as though we had an ulterior motive."
Lanning looked exasperated. "We did. I admitted that at the University Senate meeting."
"They can make it look as if we had one beyond the one we admitted."
Scott Robertson, son of the founder of U. S. Robots and still owner of a majority of the stock, leaned over from Dr. Calvin's other side and said in a kind of explosive whisper, "Why can't you get Easy to talk so we'll know where we're at?"
"You know he can't talk about it, Mr. Robertson."
"Make him. You're the psychologist, Dr. Calvin. Make him."
"If I'm the psychologist, Mr. Robertson," said Susan Calvin coldly, "let me make the decisions. My robot will not be made to do anything at the price of his well-being."
Robertson frowned and might have answered, but Justice Shane was tapping his gavel in a polite sort of way and they grudgingly fell silent.
Francis J. Hart, head of the Department of English and Dean of Graduate Studies, was on the stand. He was a plump man, meticulously dressed in dark clothing of a conservative cut, and possessing several strands of hair traversing the pink top of his cranium. He sat well back in the witness chair with his hands folded neatly in his lap and displaying, from time to time, a tight-lipped smile.
He said, "My first connection with the matter of the Robot EZ-27 was on the occasion of the session of the University Senate Executive Committee at which the subject was introduced by Professor Goodfellow. Thereafter, on the tenth of April of last year, we held a special meeting on the subject, during which I was in the chair."
"Were minutes kept of the meeting of the Executive Committee? Of the special meeting, that is?"
"Well, no. It was a rather unusual meeting." The dean smiled briefly. "We thought it might remain confidential."
"What transpired at the meeting?"
Dean Hart was not entirely comfortable as chairman of that meeting. Nor did the other members assembled seem completely calm. Only Dr. Lanning appeared at peace with himself. His tall, gaunt figure and the shock of white hair that crowned him reminded Hart of portraits he had seen of Andrew Jackson.
Samples of the robot's work lay scattered along the central regions of the table and the reproduction of a graph drawn by the robot was now in the hands of Professor Minott of Physical Chemistry. The chemist's lips were pursed in obvious approval.
Hart cleared his throat and said, "There seems no doubt that the robot can perform certain routine tasks with adequate competence. I have gone over these, for instance, just before coming in and there is very little to find fault with."
He picked up a long sheet of printing, some three times as long as the average book page. It was a sheet of galley proof, designed to be corrected by authors before the type was set up in page form. Along both of the wide margins of the galley were proofmarks, neat and superbly legible. Occasionally, a word of print was crossed out and a new word substituted in the margin in characters so fine and regular it might easily have been print itself. Some of the corrections were blue to indicate the original mistake had been the author's, a few in red, where the printer had been wrong.
"Actually," said Lanning, "there is less than very little to find fault with. I should say there is nothing at all to find fault with, Dr. Hart. I'm sure the corrections are perfect, insofar as the original manuscript was. If the manuscript against which this galley was corrected was at fault in a matter of fact rather than of English, the robot is not competent to correct it."
"We accept that. However, the robot corrected word order on occasion and I don't think the rules of English are sufficiently hidebound for us to be sure that in each case the robot's choice was the correct one."
"Easy's positronic brain," said Lanning, showing large teeth as he smiled, "has been molded by the contents of all the standard works on the subject. I'm sure you cannot point to a case where the robot's choice was definitely the incorrect one."
Professor Minott looked up from the graph he still held. "The question in my mind, Dr. Lanning, is why we need a robot at all, with all the difficulties in public relations that would entail. The science of automation has surely reached the point where your company could design a machine, an ordinary computer of a type known and accepted by the public, that would correct galleys."
"I am sure we could," said Lanning stiffly, "but such a machine would require that the galleys be translated into special symbols or, at the least, transcribed on tapes. Any corrections would emerge in symbols. You would need to keep men employed translating words to symbols, symbols to words. Furthermore, such a computer could do no other job. It couldn't prepare the graph you hold in your hand, for instance."
Minott grunted.
Lanning went on. "The hallmark of the positronic robot is its flexibility. It can do a number of jobs. It is designed like a man so that it can use all the tools and machines that have, after all, been designed to be used by a man. It can talk to you and you can talk to it. You can actually reason with it up to a point. Compared to even a simple robot, an ordinary computer with a non-positronic brain is only a heavy adding machine."
Goodfellow looked up and said, "If we all talk and reason with the robot, what are the chances of our confusing it? I suppose it doesn't have the capability of absorbing an infinite amount of data."
"No, it hasn't. But it should last five years with ordinary use. It will know when it will require clearing, and the company will do the job without charge."
"The company will?"
"Yes. The company reserves the right to service the robot outside the ordinary course of its duties. It is one reason we retain control of our positronic robots and lease rather than sell them. In the pursuit of its ordinary functions, any robot can be directed by any man. Outside its ordinary functions, a robot requires expert handling, and that we can give it. For instance, any of you might clear an EZ robot to an extent by telling it to forget this item or that. But you would be almost certain to phrase the order in such a way as to cause it to forget too much or too little. We would detect such tampering, because we have built-in safeguards. However, since there is no need for clearing the robot in its ordinary work, or for doing other useless things, this raises no problem."
Dean Hart touched his head as though to make sure his carefully cultivated strands lay evenly distributed and said, "You are anxious to have us take the machine. Yet surely it is a losing proposition for U. S. Robots. One thousand a year is a ridiculously low price. Is it that you hope through this to rent other such machines to other universities at a more reasonable price?"
"Certainly that's a fair hope," said Lanning.
"But even so, the number of machines you could rent would be limited. I doubt if you could make it a paying proposition."
Lanning put his elbows on the table and earnestly leaned forward. "Let me put it bluntly, gentlemen. Robots cannot be used on Earth, except in certain special cases, because of prejudice against them on the part of the public. U. S. Robots is a highly successful corporation with our extraterrestrial and spaceflight markets alone, to say nothing of our computer subsidiaries. However, we are concerned with more than profits alone. It is our firm belief that the use of robots on Earth itself would mean a better life for all eventually, even if a certain amount of economic dislocation resulted at first.
"The labor unions are naturally against us, but surely we may expect cooperation from the large universities. The robot, Easy, will help you by relieving you of scholastic drudgery-by assuming, if you permit it, the role of galley slave for you. Other universities and research institutions will follow your lead, and if it works out, then perhaps other robots of other types may be placed and the public's objections to them broken down by stages."
Minott murmured, "Today Northeastern University, tomorrow the world."
Angrily, Lanning whispered to Susan Calvin, "I wasn't nearly that eloquent and they weren't nearly that reluctant. At a thousand a year, they were jumping to get Easy. Professor Minott told me he'd never seen as beautiful a job as that graph he was holding and there was no mistake on the galley or anywhere else. Hart admitted it freely."
The severe vertical lines on Dr. Calvin's face did not soften. "You should have demanded more money than they could pay, Alfred, and let them beat you down."
"Maybe," he grumbled.
Prosecution was not quite done with Professor Hart. "After Dr. Lanning left, did you vote on whether to accept Robot EZ-27?"
"Yes, we did."
"With what result?"
"In favor of acceptance, by majority vote."
"What would you say influenced the vote?"
Defense objected immediately.
Prosecution rephrased the question. "What influenced you, personally, in your individual vote? You did vote in favor, I think."
"I voted in favor, yes. I did so largely because I was impressed by Dr. Lanning's feeling that it was our duty as members of the world's intellectual leadership to allow robotics to help Man in the solution of his problems."
"In other words, Dr. Lanning talked you into it."
"That's his job. He did it very well."
"Your witness."
Defense strode up to the witness chair and surveyed Professor Hart for a long moment. He said, "In reality, you were all pretty eager to have Robot EZ-27 in your employ, weren't you?"
"We thought that if it could do the work, it might be useful."
"If it could do the work? I understand you examined the samples of Robot EZ-27's original work with particular care on the day of the meeting which you have just described."
"Yes, I did. Since the machine's work dealt primarily with the handling of the English language, and since that is my field of competence, it seemed logical that I be the one chosen to examine the work."
"Very good. Was there anything on display on the table at the time of the meeting which was less than satisfactory? I have all the material here as exhibits. Can you point to a single unsatisfactory item?"
"Well-"
"It's a simple question. Was there one single solitary unsatisfactory item? You inspected it. Was there?"
The English professor frowned. "There wasn't."
"I also have some samples of work done by Robot EZ-27 during the course of his fourteen-month employ at Northeastern. Would you examine these and tell me if there is anything wrong with them in even one particular?"
Hart snapped, "When he did make a mistake, it was a beauty."
"Answer my question," thundered Defense, "and only the question I am putting to you! Is there anything wrong with the material?"
Dean Hart looked cautiously at each item. "Well, nothing."
"Barring the matter concerning which we are here engaged, do you know of any mistake on the part of EZ-27?"
"Barring the matter for which this trial is being held, no."
Defense cleared his throat as though to signal end of paragraph. He said, "Now about the vote concerning whether Robot EZ-27 was to be employed or not. You said there was a majority in favor. What was the actual vote?"
"Thirteen to one, as I remember."
"Thirteen to one! More than just a majority, wouldn't you say?"
"No, sir!" All the pedant in Dean Hart was aroused. "In the English language, the word 'majority' means 'more than half.' Thirteen out of fourteen is a majority, nothing more."
"But an almost unanimous one."
"A majority all the same!"
Defense switched ground. "And who was the lone holdout?"
Dean Hart looked acutely uncomfortable. "Professor Simon Ninheimer."
Defense pretended astonishment. "Professor Ninheimer? The head of the Department of Sociology?"
"Yes, sir."
"The plaintiff?"
"Yes, sir."
Defense pursed his lips. "In other words, it turns out that the man bringing the action for payment of $750,000 damages against my client, United States Robots and Mechanical Men Corporation, was the one who from the beginning opposed the use of the robot-although everyone else on the Executive Committee of the University Senate was persuaded that it was a good idea."
"He voted against the motion, as was his right."
"You didn't mention in your description of the meeting any remarks made by Professor Ninheimer. Did he make any?"
"I think he spoke."
"You think?"
"Well, he did speak."
"Against using the robot?"
"Yes."
"Was he violent about it?"
Dean Hart paused. "He was vehement."
Defense grew confidential. "How long have you known Professor Ninheimer, Dean Hart?"
"About twelve years."
"Reasonably well?"
"I should say so, yes."
"Knowing him, then, would you say he was the kind of man who might continue to bear resentment against a robot, all the more so because an adverse vote had-"
Prosecution drowned out the remainder of the question with an indignant and vehement objection of his own. Defense motioned the witness down and Justice Shane called luncheon recess.
Robertson mangled his sandwich. The corporation would not founder for loss of three-quarters of a million, but the loss would do it no particular good. He was conscious, moreover, that there would be a much more costly long-term setback in public relations.
He said sourly, "Why all this business about how Easy got into the university? What do they hope to gain?"
The Attorney for Defense said quietly, "A court action is like a chess game, Mr. Robertson. The winner is usually the one who can see more moves ahead, and my friend at the prosecutor's table is no beginner. They can show damage; that's no problem. Their main effort lies in anticipating our defense. They must be counting on us to try to show that Easy couldn't possibly have committed the offense-because of the Laws of Robotics."
"All right," said Robertson, "that is our defense. An absolutely airtight one."
"To a robotics engineer. Not necessarily to a judge. They're setting themselves up a position from which they can demonstrate that EZ-27 was no ordinary robot. It was the first of its type to be offered to the public. It was an experimental model that needed field-testing and the university was the only decent way to provide such testing. That would look plausible in the light of Dr. Lanning's strong efforts to place the robot and the willingness of U. S. Robots to lease it for so little. The prosecution would then argue that the field-test proved Easy to have been a failure. Now do you see the purpose of what's been going on?"
"But EZ-27 was a perfectly good model," argued Robertson. "It was the twenty-seventh in production."
"Which is really a bad point," said Defense somberly. "What was wrong with the first twenty-six? Obviously something. Why shouldn't there be something wrong with the twenty-seventh, too?"
"There was nothing wrong with the first twenty-six except that they weren't complex enough for the task. These were the first positronic brains of the sort to be constructed and it was rather hit-and-miss to begin with. But the Three Laws held in all of them! No robot is so imperfect that the Three Laws don't hold."
"Dr. Lanning has explained this to me, Mr. Robertson, and I am willing to take his word for it. The judge, however, may not be. We are expecting a decision from an honest and intelligent man who knows no robotics and thus may be led astray. For instance, if you or Dr. Lanning or Dr. Calvin were to say on the stand that any positronic brains were constructed 'hit-and-miss,' as you just did, prosecution would tear you apart in cross-examination. Nothing would salvage our case. So that's something to avoid."
Robertson growled, "If only Easy would talk."
Defense shrugged. "A robot is incompetent as a witness, so that would do us no good."
"At least we'd know some of the facts. We'd know how it came to do such a thing."
Susan Calvin fired up. A dullish red touched her cheeks and her voice had a trace of warmth in it. "We know how Easy came to do it. It was ordered to! I've explained this to counsel and I'll explain it to you now."
"Ordered to by whom?" asked Robertson in honest astonishment. (No one ever told him anything, he thought resentfully. These research people considered themselves the owners of U. S. Robots, by God!)
"By the plaintiff," said Dr. Calvin.
"In heaven's name, why?"
"I don't know why yet. Perhaps just that we might be sued, that he might gain some cash." There were blue glints in her eyes as she said that.
"Then why doesn't Easy say so?"
"Isn't that obvious? It's been ordered to keep quiet about the matter."
"Why should that be obvious?" demanded Robertson truculently.
"Well, it's obvious to me. Robot psychology is my profession. If Easy will not answer questions about the matter directly, he will answer questions on the fringe of the matter. By measuring increased hesitation in his answers as the central question is approached, by measuring the area of blankness and the intensity of counterpotentials set up, it is possible to tell with scientific precision that his troubles are the result of an order not to talk, with its strength based on First Law. In other words, he's been told that if he talks, harm will be done a human being. Presumably harm to the unspeakable Professor Ninheimer, the plaintiff, who, to the robot, would seem a human being."
"Well, then," said Robertson, "can't you explain that if he keeps quiet, harm will be done to U. S. Robots?"
"U. S. Robots is not a human being and the First Law of Robotics does not recognize a corporation as a person the way ordinary laws do. Besides, it would be dangerous to try to lift this particular sort of inhibition. The person who laid it on could lift it off least dangerously, because the robot's motivations in that respect are centered on that person. Any other course-" She shook her head and grew almost impassioned. "I won't let the robot be damaged!"
Lanning interrupted with the air of bringing sanity to the problem. "It seems to me that we have only to prove a robot incapable of the act of which Easy is accused. We can do that."
"Exactly," said Defense, in annoyance. "You can do that. The only witnesses capable of testifying to Easy's condition and to the nature of Easy's state of mind are employees of U. S. Robots. The judge can't possibly accept their testimony as unprejudiced."
"How can he deny expert testimony?"
"By refusing to be convinced by it. That's his right as the judge. Against the alternative that a man like Professor Ninheimer deliberately set about ruining his own reputation, even for a sizable sum of money, the judge isn't going to accept the technicalities of your engineers. The judge is a man, after all. If he has to choose between a man doing an impossible thing and a robot doing an impossible thing, he's quite likely to decide in favor of the man."
"A man can do an impossible thing," said Lanning, "because we don't know all the complexities of the human mind and we don't know what, in a given human mind, is impossible and what is not. We do know what is really impossible to a robot."
"Well, we'll see if we can't convince the judge of that," Defense replied wearily.
"If all you say is so," rumbled Robertson, "I don't see how you can."
"We'll see. It's good to know and be aware of the difficulties involved, but let's not be too downhearted. I've tried to look ahead a few moves in the chess game, too." With a stately nod in the direction of the robopsychologist, he added, "With the help of the good lady here."
Lanning looked from one to the other and said, "What the devil is this?"
But the bailiff thrust his head into the room and announced somewhat breathlessly that the trial was about to resume.
They took their seats, examining the man who had started all the trouble.
Simon Ninheimer owned a fluffy head of sandy hair, a face that narrowed past a beaked nose toward a pointed chin, and a habit of sometimes hesitating before key words in his conversation that gave him an air of a seeker after an almost unbearable precision. When he said, "The Sun rises in the-uh-east," one was certain he had given due consideration to the possibility that it might at some time rise in the west.
Prosecution said, "Did you oppose employment of Robot EZ-27 by the university?"
"I did, sir."
"Why was that?"
"I did not feel that we understood the-uh-motives of U. S. Robots thoroughly. I mistrusted their anxiety to place the robot with us."
"Did you feel that it was capable of doing the work that it was allegedly designed to do?"