In addition to exploring the nature of robots, Asimov delves into some of the perceptions and motivations that drive human behavior. Whereas Asimov's robots are generally shown to be logical, rigidly abiding by the laws programmed into them, his human characters are deeply irrational, often spurred by fear of the very robots they have created. In I, Robot, Asimov shows how this irrationality frequently drives human beings to acts of folly that can potentially lead to their own demise.
In “Robbie,” Mrs. Weston’s irrational fear of the robot Robbie causes her to take actions that nearly lead to the death of her daughter Gloria. After Robbie has spent two years caring for Gloria as her nursemaid, Mrs. Weston gets rid of him, fearing that something might go wrong with the robot and cause him to hurt Gloria. She believes this is a logical way to protect her daughter, but she does not anticipate the response of the rest of her family. Gloria is disconsolate, and Mr. Weston eventually caves in the face of her misery. He takes Gloria on a tour of a robot factory, where he has secretly planted Robbie so that the two can reunite. When Gloria sees Robbie, she immediately runs to him and is nearly struck by a moving vehicle in the factory, only to be saved by Robbie. This is particularly ironic, because fear of Gloria getting hurt is exactly why Mrs. Weston wanted to get rid of Robbie in the first place. In the end, it is human actions that nearly lead to Gloria’s death. Thus, it is Mrs. Weston’s irrational fear of robots, not the robot itself, that almost brings about the very harm she sought to prevent.
The officials at U.S. Robots and Mechanical Men experience several similar follies. Their concern that the robots are starting to outsmart them leads them to take actions that ultimately endanger themselves. In “Catch That Rabbit,” Powell and Donovan are testing a new model of robot, DV-5 (Dave), at an asteroid mining station. But when they are not around, Dave stops producing ore, and he cannot tell them what he was doing during the shifts when he was supposed to be mining, as if he had blacked out. When Powell and Donovan attempt to observe Dave, however, the robot functions perfectly. They grow increasingly concerned that the robot is lying to them: Donovan exclaims, “There’s something—sinister—about—that,” pounding the desk in desperation. They learn from other robots that Dave stops working when there are threats of emergency cave-ins, and so, in order to observe what happens to the robot in these situations, they try to create an emergency. In the process, they accidentally trap themselves in a cave-in. Fortunately, they find a way to attract Dave’s attention and are rescued, but the story emphasizes once again that it is the humans’ irrational fear of Dave’s behavior, not Dave’s actions themselves, that leads to their dire situation.
Robots are frequently the solutions to humans’ problems, not the instigators. In “Little Lost Robot,” a military research station has modified the First Law of Robotics in some of its NS-2 (Nestor) models, so that the only injunction they follow is that a robot may not harm a human being (i.e., these robots may now allow human beings to come to harm through inaction). When the station “loses” one of these robots among 62 others and cannot tell which one has been modified, the officials call in Susan Calvin to help identify it. Calvin becomes deeply fearful, worried that the robot could be smart enough to find ways around the First Law entirely now that it has been weakened. She also grows concerned about the potential outcry if people discover what the station has done. Subsequently, she puts herself in danger in order to discover which robot is Nestor 10, and the robot even attempts to attack her directly. Thus, like Donovan and Powell, Calvin lets her fear lead her to an irrational method of investigating the robot, one that only puts her in harm’s way.
In “Escape!” humans face similar potential harm, even though their motivations are less driven by fear. The officials at U.S. Robots and Mechanical Men want to build a hyperspace ship, but they worry that the computer building the ship, The Brain, will malfunction. They are concerned because the hyperspace jump means that the men on board will temporarily cease to exist (effectively dying, if only briefly), and the robots are programmed to make sure that human beings do not come to harm. Dr. Calvin therefore tells The Brain, “When we come to a sheet which means damage, even maybe death, don’t get excited. You see, Brain, in this case, we don’t mind—not even about death; we don’t mind at all.” Dr. Calvin is not spurred by a fear of robots, but her irrationality nearly leads to the deaths of the men on board, as The Brain assumes full control of the ship and gives the men only beans and milk on which to survive. She does not follow her words to their rational conclusion—that The Brain could allow the humans to die by any means—and therefore places them in grave danger.
In this collection of short stories, Asimov coins the term “Frankenstein complex,” which refers to the fear that mechanical men and robots will either replace or dominate their human creators. Yet Asimov proves this fear to be an irrational one, not only because the robots are so ethical and so bound to the Three Laws of Robotics that humans have set out for them, but also because the humans’ fear often leads them to put themselves in even more dangerous situations than the theoretical harm they might have suffered at the hands of those robots.
Irrationality, Fear, and Folly ThemeTracker
Irrationality, Fear, and Folly Quotes in I, Robot
“You listen to me, George. I won’t have my daughter entrusted to a machine—and I don’t care how clever it is. It has no soul, and no one knows what it may be thinking. A child just isn’t made to be guarded by a thing of metal.”
It took split-seconds for Weston to come to his senses, and those split-seconds meant everything, for Gloria could not be overtaken. Although Weston vaulted the railing in a wild attempt, it was obviously hopeless. Mr. Struthers signalled wildly to the overseers to stop the tractor, but the overseers were only human and it took time to act.
It was only Robbie that acted immediately and with precision.
He called a last time, desperately: “Speedy! I’m dying, damn you! Where are you? Speedy, I need you.”
He was still stumbling backward in a blind effort to get away from the giant robot he didn’t want, when he felt steel fingers on his arms, and a worried, apologetic voice of metallic timbre in his ears.
Donovan pounded the desk, “But, Greg, he only goes wrong when we’re not around. There’s something—sinister—about—that.” He punctuated the sentence with slams of fist against desk.
“All normal life, Peter, consciously or otherwise, resents domination. If the domination is by an inferior, or by a supposed inferior, the resentment becomes stronger. Physically, and, to an extent, mentally, a robot—any robot—is superior to human beings. What makes him slavish, then? Only the First Law! […]”
“Susan,” said Bogert, with an air of sympathetic amusement. “I’ll admit that this Frankenstein Complex you’re exhibiting has a certain justification—hence the First Law in the first place. But the Law, I repeat and repeat, has not been removed—merely modified.”
“When we come to a sheet which means damage, even maybe death, don’t get excited. You see, Brain, in this case, we don’t mind—not even about death; we don’t mind at all.”
“But you are telling me, Susan, that the ‘Society for Humanity’ is right; and that Mankind has lost its own say in its future.”
“It never had any, really. It was always at the mercy of economic and sociological forces it did not understand—at the whims of climate, and the fortunes of war.” […]
“How horrible!”
“Perhaps how wonderful! Think, that for all time, all conflicts are finally evitable. Only the Machines, from now on, are inevitable!”