Stephen Byerley Quotes in I, Robot
“Actions such as his could come only from a robot, or from a very honorable and decent human being. But you see, you just can’t differentiate between a robot and the very best of humans.”
“I like robots. I like them considerably better than I do human beings. If a robot can be created capable of being a civil executive, I think he’d make the best one possible. By the Laws of Robotics, he’d be incapable of harming humans, incapable of tyranny, of corruption, of stupidity, of prejudice.” […]
“Except that a robot might fail due to the inherent inadequacies of his brain. The positronic brain has never equalled the complexities of the human brain.”
“Very well, then, Stephen, what harms humanity? Economic dislocations most of all, from whatever cause. Wouldn’t you say so?”
“I would.”
“And what is most likely in the future to cause economic dislocations? Answer that, Stephen.”
“I should say,” replied Byerley, unwillingly, “the destruction of the Machines.”
“And so should I say, and so should the Machines say. Their first care, therefore, is to preserve themselves, for us.”
“But you are telling me, Susan, that the ‘Society for Humanity’ is right; and that Mankind has lost its own say in its future.”
“It never had any, really. It was always at the mercy of economic and sociological forces it did not understand—at the whims of climate, and the fortunes of war.” […]
“How horrible!”
“Perhaps how wonderful! Think, that for all time, all conflicts are finally evitable. Only the Machines, from now on, are inevitable!”