I, Robot

by

Isaac Asimov

I, Robot: Little Lost Robot Summary & Analysis

The reporter waits two days before interviewing Calvin again. He asks to hear more stories about robots and interstellar travel. She starts to narrate her first experience with interstellar travel, in 2029, when a robot had been lost. Because of this, all work on the Hyperatomic Drive came to a halt, and no one was allowed to enter or leave that area of space except for Calvin and Bogert, who were brought in on a special government patrol ship.
“Little Lost Robot” serves as another example (similar to “Runaround” and “Catch That Rabbit”) of how humans’ lack of understanding of the beings that they have created, and their inability to think through the consequences of their actions, lead to placing themselves in dangerous situations.
During Calvin’s first dinner at the Hyper Base, Major-general Kallner, who is heading the project, explains the issue: they have 63 robots on one ship that look identical, but one of them differs in a crucial way. Bogert, who knew about the alteration, ultimately confesses that the “lost” robot has been programmed with a weakened First Law of Robotics because of the kind of work that they need some of the robots to do. Kallner is desperate to keep the existence of a robot with a weakened First Law from becoming public.
As Calvin goes on to explain, the First Law of Robotics is the Law that keeps the other two in place, and by modifying or weakening it, robots can act in self-interest rather than in the interest of humans, essentially removing their ethical code. As Kallner notes here, if people were to find out about the action that they had taken, they would start to become (perhaps rightfully) fearful of what the robots might do.
Calvin asks why they modified the robot. Kallner tells her that their men work with radiation, but precautions are taken to protect them. However, it is impossible to convince the robots that the men are safe: the robots believe that if they do not try to protect the men, the men will come to harm through the robots’ inaction. Thus, whenever the men expose themselves to radiation, the nearest robot tries to drag them out, and the radiation destroys it. And so the engineers kept only the first part of the Law, which reads “No robot may harm a human being.” The modified robots are not compelled to act if a human being is in danger.
Even though modifying the First Law of Robotics is a logical way of getting around the robots’ instincts to protect humans, it is clear that Kallner and the others did not think through the possible outcomes. Calvin presents several examples later on that show how this modification could allow robots to harm human beings, if indirectly. This ultimately leads to Calvin nearly coming to harm at the hands of Nestor 10, showing how this irrationality or lack of logical thinking leads to the humans’ folly.
The next morning, Calvin goes to Bogert and insists that she should have been told about the adjustment. She tells him about the possible pitfalls: the First Law is the only thing that keeps the robots loyal to the humans, and without it, robots could easily enable the death of humans. Bogert says that she’s exhibiting a “Frankenstein Complex.”
Asimov coins the term “Frankenstein Complex” here, meaning a fear of robots, or beings that men have created—a reference to Mary Shelley’s Frankenstein, in which a scientist with noble intentions creates a humanoid being that becomes a murderous monster. Asimov wrote these stories largely to show how the Frankenstein Complex is a misguided one, except in the cases where humans are the ones who lead to their own destruction because of their fear or irrationality.
Later in the morning, Calvin has an interview with Gerald Black—the last person to have seen the lost robot, Nestor 10. Calvin asks if there was anything unusual about the robots with the First Law modifications. Black says that they’re only cleverer—they have learned physics from the technicians—and they’re very calm and curious, which sometimes annoys him. When he wants something done quickly, they sometimes take their time.
Throughout the story, Calvin tries to combat the irrational thought and illogic of her peers and solve the problem of the lost robot through reasoning. Black’s statement that the robots have learned physics is the key detail that allows Calvin to solve the mystery of which robot is Nestor 10.
Bogert asks Black about the morning he last saw Nestor 10. Black admits that he had been frustrated and behind schedule. Nestor 10 was bothering him to repeat an experiment he had abandoned a month ago, and so he told him angrily to “go lose [himself],” along with a series of derogatory remarks. Calvin dismisses Black, thanking him for his cooperation. Calvin then interviews all 63 robots over five hours, and all 63 seem the same to her. She realizes that Nestor 10 is lying because he is obeying the Second Law and trying to fulfill the urgent command that Black set for him.
Like Cutie’s actions in “Reason,” even though Nestor 10’s intentions might appear sinister on the surface, it quickly becomes clear that the robot is still following the laws of robotics. The humans’ assumption is that the robot might be behaving more and more like a human, less compelled by an ethical code. But in reality, the robot is more ethical than the humans would be and is simply following the directions Black inadvertently set.
Calvin worries, however, that Nestor 10 is also trying to prove his superior intellect to the humans by completely stumping them, and again worries that he might want to harm humans. She tells Bogert that a modified robot could drop something heavy above a human, knowing that it could catch the weight before it hit the man. However, once the robot lets go of the weight, it is no longer an active agent and through inaction could allow the weight to strike the man.
Even though Nestor 10 is technically obeying the command that the humans have set, there’s also an element in which Nestor 10 is playing into the idea that he can escape human control and prove his superiority. This again ties together the two concepts of robot superiority and humans’ loss of control over their creations.
Calvin and Bogert decide to perform tests directly on the response to the First Law. They drop a weight above a man 10 times in front of each robot (though a laser beam will prevent the weight from actually hitting the man). Each time, every single one of the 63 robots jumps up to save the man. Nestor 10 does so of his own free will, trying to blend in. Calvin decides they will have to try something else, particularly now that Nestor 10 must be aware that they are trying to conduct experiments to find him and is deliberately lying to them.
Calvin’s understanding that Nestor 10 is lying to them again proves Powell incorrect in his assertion that robots cannot consciously lie. Both “Liar” and “Little Lost Robot” present situations in which that is untrue, thus proving how humans are sometimes not wise or logical enough to foresee how their creations will develop.
Calvin designs another experiment: they will again drop a weight above a man, but there will be electric cables between the man and the robots, which they will tell the robots will electrocute them and cause their deaths (though this won’t actually be true). Normal robots will try to save the man, as the First Law takes precedence over the Third. But for Nestor 10, his desire for self-preservation will cause him to remain in his seat. When they conduct the experiment, however, none of the robots move. Calvin is horrified.
On the surface, this appears to be a repudiation of the idea that robots will always stick to their ethical code, as set out in the Three Laws of Robotics. But when Calvin subsequently interviews the robots on why they decided not to save the man, it shows that they are not inherently disobeying the Laws—they are simply experiencing an evolution in how they view the ethics of those Laws.
Calvin interviews each robot again, asking each to explain what happened. One of them, Number 28, says that if he had died trying to save the man, he wouldn’t have been able to save him anyway. He would then be destroyed for no purpose, and unable to prevent future harm. She asks Number 28 if he thought of this idea himself. He answers no, that the robots were all talking the previous night, and that another robot had the idea. Calvin asks which one, but Number 28 says that he doesn’t know. Kallner is frustrated that Calvin has not found the solution, while Calvin insists that they have to separate the robots from each other or else she will resign and make the matter public.
Clearly, Nestor 10 is the robot who came up with the idea, which demonstrates that while the robots are still sticking to an ethical code, how they think about that ethical code is becoming much more complex. This idea starts to delve into moral philosophy, showing how the robots can be just as concerned about morality, and think about life and death in as complex a way as humans can.
Calvin gets one last idea, remembering that one additional difference between Nestor 10 and the other robots is that Nestor 10 has learned physics. She and Bogert do another interview, telling the robots that they are going to conduct another experiment with a person in danger. Between the robot and the person will be a gamma ray field, which Bogert tells them will kill them instantly—so it would again be pointless for them to try and save the person.
Of all of the people working on the problem, only Calvin is able to find the reasoning that will help her prove which robot is Nestor 10. But the fact that she is the chief robopsychologist yet it has taken her many, many experiments and interviews to finally find Nestor 10 demonstrates how the robots have become both physically and often mentally superior to the humans.
After the interviews, they begin the experiment, and Calvin insists on being the person that they must try to save so that she can observe the robots directly. The robots are then placed in a circle around her. The weight is dropped, but this time, only a single robot, Nestor 10, jerks upright, takes two steps, and then stops dead. Calvin immediately tells all the other robots to leave the room, which they do. Nestor 10 says that he was told to be lost, and that he must obey. He starts to walk toward Calvin and his metal arm flies out toward her. She sinks to the ground under the weight of his arm.
Even though the experiment ends successfully, it still demonstrates Calvin and the other humans’ folly. She is able to discover Nestor 10, but in the process puts herself in danger, particularly as she believes that Nestor 10 may be fully capable of harming her. Indeed, she is nearly attacked by the robot in his quest to remain “lost.”
Black bursts into the room and asks if Calvin is hurt. She shakes her head. He pries the arm off of her; Nestor 10 is sprawled next to her. Black explains that they turned on a gamma field when they saw Nestor 10 attack. Calvin admits that Nestor 10 wasn’t really attacking her, merely trying to.
Even though Nestor 10 is trying to attack her, Calvin acknowledges that he can’t do so completely because he is still bound by the ethical principles of the First Law, illustrating how strictly the robots are bound to these imperatives.
Calvin has one last meeting with Kallner, insisting that the modified Nestors be destroyed. Kallner agrees, and asks how she discovered him. Calvin explains that they didn’t flood the room with gamma rays, but with infrared rays instead. Only Nestor 10, who had knowledge of physics, knew that the rays were not gamma rays. He thought that all of the other robots would know this as well, and assumed that they would all jump up to save Calvin. He forgot that the other robots might be more ignorant than he was.
Asimov demonstrates the brilliance of Calvin’s plan, as she is able to trap Nestor 10 in his own belief of his superiority. The robot not only believes that he is superior, but that all robots are superior to humans. Thus, the very belief that Calvin fears is the belief that allows her to catch Nestor 10—proving that she has a fuller understanding of the robot’s logic and psychology.