While Asimov does delineate differences between the human and nonhuman figures in I, Robot, he also shows how the societies in his stories have started to blend the two. He endows the robots with a technology called a “positronic brain,” which gives them a form of consciousness akin to the kind humans possess. It is this consciousness that leads humans to anthropomorphize the robots’ actions, behaviors, and thought processes. Asimov demonstrates that the robots seem so sentient because humans can only understand the machines in terms of their own behavior, and therefore project their own emotions and flaws onto them. This tendency to treat the robots like fellow humans suggests that people are limited in their understanding of nonhuman objects and beings, and are thus inherently tempted to humanize and ascribe consciousness to the things they create, even when those creations are distinctly nonhuman.
One of the main ways Asimov endows the robots with human-like qualities throughout the book is through names and pronouns. In nearly all of the stories, a robot’s model number is adapted into a name: RB becomes Robbie, NS-2 becomes Nestor, and QT-1 becomes Cutie. In each case, the robot is referred to as “he,” not as “it.” This simple reframing of language shifts how the humans think about the robots—not as mechanical operating systems, but as sentient beings. This becomes clear in “Robbie,” when Gloria insists that Robbie is her friend: “‘He was not no machine!’ screamed Gloria, fiercely and ungrammatically. ‘He was a person just like you and me and he was my friend.’” Even though Robbie cannot talk or emote the way a person can, the way that Gloria thinks about him makes him just as human to her as anyone else in her life.
In several other stories, humans start to anthropomorphize the robots’ behavior, demonstrating how they can only understand the robots in terms of human language. In “Runaround,” Speedy is caught between the Second and Third Laws of Robotics and subsequently teeters back and forth, trapped in a loop. He also starts to quote Gilbert and Sullivan and other simple rhymes, clearly caught in some software error. Donovan concludes that Speedy is “drunk,” ascribing human behavior to him. It is impossible for Speedy to be drunk in the literal sense, of course, because he cannot drink. But in the behavioral sense, this description makes the robot seem even more human-like. In “Catch That Rabbit,” the robot Dave has six subsidiary robots that he controls, which Donovan and Powell call “fingers.” When there is an emergency, Dave lacks the decisiveness to control all six “fingers,” and therefore he does nothing. Powell says that Dave is simply “twiddling his fingers,” once again ascribing a human behavioral trait in order to understand the scenario more fully. This occurs once more in “Escape!”, in which a supercomputer nicknamed “The Brain” is tasked with building a hyperspace ship that would cause humans to die temporarily. Because this goes against the First Law, The Brain develops a coping mechanism. As Dr. Susan Calvin describes, “He developed a sense of humor—it’s an escape, you see, a method of partial escape from reality. He became a practical joker.” Again, the humans can only fully understand the robot in terms of how they themselves might cope with something upsetting, causing the robots to take on even more human qualities in their minds.
Of course, there are instances in which the robots’ consciousness and emotional intelligence are shown to be limited compared to humans’. In “Liar!”, robot RB-34 (Herbie) is a mind-reading robot. When Calvin asks him what another officer, Milton Ashe, thinks of her, the robot tells her that Ashe is in love with her, which elates her. But when this turns out not to be true, Calvin recognizes that the robot lied because it didn’t want to hurt her feelings—despite the fact that lying to her ended up hurting her even more. Still, the fact that Calvin (and others to whom Herbie lied) thought that Herbie would only tell the truth proves that she believed the robot would act the same way a human would, knowing that a lie would ultimately be more harmful. Even though Calvin is the chief robopsychologist at U.S. Robots and Mechanical Men and should understand the differences between humans and robots, she still assumes that the robot bears similar attributes to a human and takes his word as such.
While some of the characters insist that the robots should be thought of purely as machines, it proves impossible not to think of the robots as having human qualities. Asimov makes this point again and again, as he shows the human characters ascribing human attributes to their robotic counterparts. Ultimately, this demonstrates that people are limited by their human-centric perspective and have an inherent tendency to ascribe human qualities to nonhuman things, and that this can often lead to misunderstanding and frustration.
Artificial Intelligence, Consciousness, and Humanity Quotes in I, Robot
Susan said nothing at that seminar; took no part in the hectic discussion period that followed. She was a frosty girl, plain and colorless, who protected herself against a world she disliked by a mask-like expression and a hypertrophy of intellect. But as she watched and listened, she felt the stirrings of a cold enthusiasm.
“Why do you cry, Gloria? Robbie was only a machine, just a nasty old machine. He wasn’t alive at all.”
“He was not no machine!” screamed Gloria, fiercely and ungrammatically. “He was a person just like you and me and he was my friend. I want him back. Oh, Mamma, I want him back.”
It was Powell who broke the desperate silence. “In the first place,” he said, “Speedy isn’t drunk—not in the human sense—because he’s a robot, and robots don’t get drunk. However, there’s something wrong with him which is the robotic equivalent of drunkenness.”
“To me, he’s drunk,” stated Donovan, emphatically, “and all I know is that he thinks we’re playing games. And we’re not. It’s a matter of life and very gruesome death.”
“Remember, those subsidiaries were Dave’s ‘fingers.’ We were always saying that, you know. Well, it’s my idea that in all these interludes, whenever Dave became a psychiatric case, he went off into a moronic maze, spending his time twiddling his fingers.”
But Susan Calvin whirled on him now and the hunted pain in her eyes became a blaze, “Why should I? What do you know about it all, anyway, you…you machine. I’m just a specimen to you; an interesting bug with a peculiar mind spread-eagled for inspection. It’s a wonderful example of frustration, isn’t it? Almost as good as your books.” Her voice, emerging in dry sobs, choked into silence.
The robot cowered at the outburst. He shook his head pleadingly. “Won’t you listen to me, please? I could help you if you would let me.”
She went on, “So he accepted the item, but not without a certain jar. Even with death temporary and its importance depressed, it was enough to unbalance him very gently.”
She brought it out calmly, “He developed a sense of humor—it’s an escape, you see, a method of partial escape from reality. He became a practical joker.”
“I like robots. I like them considerably better than I do human beings. If a robot can be created capable of being a civil executive, I think he’d make the best one possible. By the Laws of Robotics, he’d be incapable of harming humans, incapable of tyranny, of corruption, of stupidity, of prejudice.” […]
“Except that a robot might fail due to the inherent inadequacies of his brain. The positronic brain has never equalled the complexities of the human brain.”