I, Robot

by

Isaac Asimov

I, Robot: The Evitable Conflict Summary & Analysis

Stephen Byerley and Susan Calvin meet in his study. Byerley is now World Coordinator of Earth, and he has noticed problems with the Machines that run the economy: certain sectors are showing small imbalances. The Mexican Canal is two months behind schedule, the mercury mines at Almaden are underproducing, and the Hydroponics plant is laying people off.
The final story in Asimov’s book demonstrates how complex the robots have become: not only do the new Machines now run the human economy, but people still have not figured out whether Byerley is or is not a robot, further showing how sophisticated and lifelike robots have grown.
Themes: Artificial Intelligence, Consciousness, and Humanity
Byerley explains that every period of human development has been driven by “inevitable conflicts,” with groups fighting one another right up through the 20th century and its nationalist and ideological wars. When the robots came, however, the world changed, and it “no longer seemed so important whether the world was Adam Smith or Karl Marx.” Earth’s economy is now stable because of the calculating Machines that control it and that protect humanity through the First Law. This economic balance put an end to all wars.
Byerley emphasizes an important moral of this story and of the book as a whole: when humans are left to govern themselves, the result is often conflict and violence that harms them. When the Machines regulate humanity, people are much better off. Machine regulation eliminates the need for competing economic theories and divisions, because the Machines ensure that the economy helps as many people as possible.
Themes: Morality and Ethics; Human Superiority and Control
Byerley goes on to say that these errors may nevertheless lead to another war, because the Machines may not be fulfilling their function. The errors should not exist, since the Machines are self-correcting and should not harm people in this way. He adds that they have no way to check exactly what is wrong, because the complexity of the Machines’ work has far exceeded human knowledge: they analyze a nearly infinite amount of data in very little time. Calvin agrees that they “no longer understand [their] own creations.”
Because the Machines follow the First Law, they are more ethical than humans: they are bound to ensure the security of humanity as a whole. Asimov implies here, however, that the Machines are also superior in a more general sense. Because of their complexity and their ability to process vast amounts of data in ways humans cannot follow, humans have essentially lost control of how they operate.
Themes: Morality and Ethics; Human Superiority and Control
Byerley says they asked the Machine for explanations of these economic aberrations, but the Machine could not answer. Byerley offers his first theory: the Machines might have been given the wrong data, meaning the error lies on the human end, not the Machine end. Byerley and Calvin then begin to go through each region of Earth, reviewing the data and speaking with each Regional Coordinator.
Asimov also shows how far people have come from the distrust of robots exhibited in “Robbie” and “Reason.” Byerley trusts the Machines’ precision more than he trusts that of humans, and therefore assumes human error first.
Themes: Irrationality, Fear, and Folly
Byerley starts with the Eastern Region, coordinated by Ching Hso-Lin. Ching explains that over the last few months, some of their synthetic-food plants have had to be shut down: shifting food fads sometimes require different equipment, and therefore different people to run that equipment. He explains that the Machines usually predict these changes well, but sometimes too much of a product is made just as it goes out of fashion.
Ching’s assessment seems to agree with Byerley’s: the error stems from the somewhat unpredictable nature of human behavior rather than from errors in the Machines. This implies that the Machines might not be so advanced after all, since they cannot fully account for human behavior. But because this is not actually the reason for the errors, the moment in fact underscores how advanced the Machines truly are.
Themes: Human Superiority and Control
Ching does point out one odd incident: a man named Rama Vrasayana was running a plant that was forced to close due to competition, and strangely, the Machine did not warn Vrasayana to renovate. Otherwise, Ching assures Byerley and Calvin, this is the only issue they’ve had.
This detail foreshadows the eventual reveal of what is wrong. That the Machines fail to perform their bound duty of helping Vrasayana suggests that the man may be involved in something that endangers people, and that the Machines may thus be acting to protect humanity as a whole.
Themes: Human Superiority and Control
Byerley goes through the same steps with the Tropic Region, coordinated by Lincoln Ngoma. Ngoma explains that they’re a little short on labor to finish the Mexican Canal. He refers to an incident in which Francisco Villafranca, the engineer in charge, was involved in a cave-in that set the project back. The Machine later reported that Villafranca’s calculations were off, but Villafranca claimed the Machine had given him different data the first time, and that he had followed it faithfully.
The incident Ngoma describes subtly illustrates how deeply the trust humans place in robots and the Machines has shifted as the technology has advanced. Whereas humans like Mrs. Weston, Donovan, and Powell once distrusted the robots, here Ngoma takes the Machine’s word over a fellow human’s.
Themes: Irrationality, Fear, and Folly; Artificial Intelligence, Consciousness, and Humanity
Ngoma notes that it makes sense that Villafranca would blame the Machine—particularly because he attended conferences by the Society for Humanity, an anti-Machine group that grew out of the Fundamentalists.
This reveal serves as another hint at the underlying cause of the Machines’ actions: they are working to undermine the organization that poses the greatest threat to them.
Themes: Human Superiority and Control
Next, Byerley and Calvin meet with Madame Szegeczowska, Coordinator of the European Region, to discuss the mercury mines falling behind on production. Szegeczowska says that Almaden is run by a Northern company connected with the Society for Humanity, and she worries that its managers have not been consulting the Machines. She assures them, however, that the company is being sold to a group of Spaniards and that there will likely be no more trouble.
Szegeczowska’s statements reach much the same conclusion as Ngoma’s. In both cases, the coordinators do not doubt the Machines they have come to rely on; instead, they suspect that humans are the ones causing the errors. This makes sense, because Asimov has shown throughout the book that humans often make irrational mistakes based on fears like those the Society for Humanity holds.
Themes: Irrationality, Fear, and Folly
Byerley and Calvin finish with the Northern Region, coordinated by Hiram Mackenzie. Mackenzie rejects the idea that the Machines are producing errors because of incorrect data, since they can recognize outliers in what they are given. When Byerley asks how he accounts for the errors, Mackenzie offers the example of someone buying cotton textiles: there is no quantitative data that can predict what will feel right to a buyer, and buyers cannot explain it themselves. In such cases there is simply no data to give the Machines, because the human brain is subjective and inconsistent.
Again, just as with the previous coordinators, Mackenzie focuses on human error rather than on the potential errors (or intentional disruptions) the Machines could be creating. This once again reinforces the idea that human beings may not be intelligent or logical enough to understand the motivations and actions of the robots they have created, and have thus lost full control of them.
Themes: Human Superiority and Control
Byerley and Calvin regroup to review what they have been told. They wonder whether people are deliberately disobeying the Machines so that their own regions might gain greater economic status or power than the others. Byerley also notes that many of these incidents link back to the Society for Humanity: Villafranca and Vrasayana were both members, and the company at Almaden also had ties to it. These people, he concludes, must have distrusted the Machines and disobeyed them. He resolves to have the Society outlawed and every member removed from government posts.
Byerley and Calvin recognize the fears of the Society for Humanity, but instead of giving them credence, they see how the Society’s irrationality could lead to disaster. They want to avoid at all costs the destruction of the current social order and a return to a time when war raged across the Earth, even if that means relying on Machines and robots.
Themes: Irrationality, Fear, and Folly
Calvin offers an alternative theory to Byerley. The Machines follow the First Law, and work for the good of humanity as a whole. If the Machines were to be destroyed, the Earth would come to great harm. Therefore, they have been “quietly taking care of the only elements left that threaten them.” The Machines have been disrupting people and companies that threaten them, so that they can remain in control of the economy.
Calvin clarifies the real motivation behind the Machines’ actions: they are still bound by the ethics with which they were programmed, but they recognize that their control of the economy is of paramount importance to the good of humanity, and therefore that they must do a little harm in order to do much more good.
Themes: Morality and Ethics; Human Superiority and Control
Byerley is stunned, wondering if this means that mankind has lost its say in its future. Calvin responds that humanity was always at the mercy of social and economic forces it did not understand. Now the Machines understand those forces, and perhaps it is a good thing that they are in control, because they are looking out for humanity. Perhaps, she says, all conflicts “are finally evitable.”
With this final development, Asimov suggests that humans are doomed to lose control of their creations: they are now subject to the Machines that run their economy. In contrast to many other literary and pop-cultural portrayals of robots, however, Calvin argues that this can be a good thing, because the Machines are deeply invested in the good of humanity and are more ethical than humans are.
Themes: Morality and Ethics; Human Superiority and Control
Calvin concludes her account to the reporter, explaining that she has seen it all: from the time when robots could not speak to the end, when they stood “between mankind and destruction.” She tells the reporter that he will see what comes next. They never speak again, and Calvin dies at the age of 82.
Calvin has witnessed an enormous advancement of technology, observing robots that pass as human and that now control humanity’s fate, despite humans’ intentions never to be replaced by, or made subject to, their creations. The book thus offers a counterargument to the dominant narrative of the 20th century, which tended to fear fast-developing technology. Instead, Asimov suggests that letting technology take the reins might actually be a good thing, so long as the intention behind the programming is to improve mankind’s condition.
Themes: Human Superiority and Control; Artificial Intelligence, Consciousness, and Humanity