Dr. Susan Calvin Quotes in I, Robot
Susan said nothing at that seminar; took no part in the hectic discussion period that followed. She was a frosty girl, plain and colorless, who protected herself against a world she disliked by a mask-like expression and a hypertrophy of intellect. But as she watched and listened, she felt the stirrings of a cold enthusiasm.
“Then you don’t remember a world without robots. There was a time when humanity faced the universe alone and without a friend. Now he has creatures to help him; stronger creatures than himself, more faithful, more useful, and absolutely devoted to him. […] But you haven’t worked with them, so you don’t know them. They’re a cleaner better breed than we are.”
But Susan Calvin whirled on him now and the hunted pain in her eyes became a blaze, “Why should I? What do you know about it all, anyway, you…you machine. I’m just a specimen to you; an interesting bug with a peculiar mind spread-eagled for inspection. It’s a wonderful example of frustration, isn’t it? Almost as good as your books.” Her voice, emerging in dry sobs, choked into silence.
The robot cowered at the outburst. He shook his head pleadingly. “Won’t you listen to me, please? I could help you if you would let me.”
“All normal life, Peter, consciously or otherwise, resents domination. If the domination is by an inferior, or by a supposed inferior, the resentment becomes stronger. Physically, and, to an extent, mentally, a robot—any robot—is superior to human beings. What makes him slavish, then? Only the First Law! […]”
“Susan,” said Bogert, with an air of sympathetic amusement. “I’ll admit that this Frankenstein Complex you’re exhibiting has a certain justification—hence the First Law in the first place. But the Law, I repeat and repeat, has not been removed—merely modified.”
“That he himself could only identify wave lengths by virtue of the training he had received at Hyper Base, under mere human beings, was a little too humiliating to remember for just a moment. To the normal robots the area was fatal because we had told them it would be, and only Nestor 10 knew we were lying. And just for a moment he forgot, or didn’t want to remember, that other robots might be more ignorant than human beings. His very superiority caught him.”
“When we come to a sheet which means damage, even maybe death, don’t get excited. You see, Brain, in this case, we don’t mind—not even about death; we don’t mind at all.”
She went on, “So he accepted the item, but not without a certain jar. Even with death temporary and its importance depressed, it was enough to unbalance him very gently.”
She brought it out calmly, “He developed a sense of humor—it’s an escape, you see, a method of partial escape from reality. He became a practical joker.”
“Actions such as his could come only from a robot, or from a very honorable and decent human being. But you see, you just can’t differentiate between a robot and the very best of humans.”
“I like robots. I like them considerably better than I do human beings. If a robot can be created capable of being a civil executive, I think he’d make the best one possible. By the Laws of Robotics, he’d be incapable of harming humans, incapable of tyranny, of corruption, of stupidity, of prejudice.” […]
“Except that a robot might fail due to the inherent inadequacies of his brain. The positronic brain has never equalled the complexities of the human brain.”
“Very well, then, Stephen, what harms humanity? Economic dislocations most of all, from whatever cause. Wouldn’t you say so?”
“I would.”
“And what is most likely in the future to cause economic dislocations? Answer that, Stephen.”
“I should say,” replied Byerley, unwillingly, “the destruction of the Machines.”
“And so should I say, and so should the Machines say. Their first care, therefore, is to preserve themselves, for us.”
“But you are telling me, Susan, that the ‘Society for Humanity’ is right; and that Mankind has lost its own say in its future.”
“It never had any, really. It was always at the mercy of economic and sociological forces it did not understand—at the whims of climate, and the fortunes of war.” […]
“How horrible!”
“Perhaps how wonderful! Think, that for all time, all conflicts are finally evitable. Only the Machines, from now on, are inevitable!”