Weapons of Math Destruction

by

Cathy O’Neil

Weapons of Math Destruction: Conclusion Summary & Analysis

WMDs cause destruction and chaos throughout society: in public schools, colleges, courts, workplaces, voting booths, and more. But it’s too late to disarm these weapons one by one—they all feed into one another. Data encourages companies to send people predatory ads. It also encourages police to go into vulnerable neighborhoods, and then it influences the courts to give the people whom police arrest longer prison sentences. All this data tells other WMDs that these people are high risks—so they’re blocked from jobs and watch helplessly as their interest and insurance rates ratchet up. All the while, WMDs keep the wealthy and comfortable in silos of their own, ignorant of others’ suffering. WMDs are part of a “silent war.” 
Here, O’Neil walks readers through the interconnected processes behind applying to schools and jobs and seeking credit and insurance. In doing so, she illustrates why WMDs are so dangerous: the data used to create them now feeds multiple systems at once. It’s hard to disengage from WMDs because they’re virtually everywhere.
Themes: Humanity vs. Technology; Discrimination in Algorithms; Fairness vs. Efficiency; Data, Transparency, and U.S. Democracy
Corporations can right wrongs in their policies and algorithms. For instance, even though President Bill Clinton signed the Defense of Marriage Act into law in 1996, IBM promised a week later to extend benefits to its employees’ same-sex partners. The company did so not necessarily out of morality, but because other tech giants were already doing so, and it didn’t want to lose employees to competitors. So, in a bid to attract a growing talent pool of LGBT workers, IBM corrected an unfairness.
Here, O’Neil shows how a simple change in policy at a crucial moment had a ripple effect. Even though IBM made this particular move to maximize its efficiency and competitiveness in the market rather than out of principle, the episode suggests that there’s still room for major corporations to make big changes in the name of equity and fairness.
Themes: Humanity vs. Technology; Fairness vs. Efficiency
In that scenario, everyone won, but companies aren’t always so incentivized to dismantle their WMDs. Many WMD victims are among the most voiceless and disenfranchised: the poor, the incarcerated, the vulnerable. WMDs start by operating on these easy targets. But it won’t be long, O’Neil predicts, before the models evolve and spread, targeting the middle and upper classes as their operators search for new opportunities.
Just because predatory college loans and sky-high insurance rates currently target working-class people doesn’t mean that corporations won’t start aiming their WMDs at other groups. If we don’t stand up for the vulnerable, O’Neil is saying, soon we’ll all be victims of predatory WMDs.
Themes: Humanity vs. Technology; Discrimination in Algorithms; Fairness vs. Efficiency; Data, Transparency, and U.S. Democracy
The main difference between the WMDs of the present and the prejudiced human errors of the past is simple: humans can evolve, learn, and adapt. But automated systems are stuck in time—engineers have to change them as society progresses. So essentially, “Big Data processes codify the past” rather than inventing the future. Only humans have the “moral imagination” needed to create a better world. O’Neil asserts that humanity is in the throes of a new kind of industrial revolution—and it is urgent that we learn from the mistakes of the last one, which exploited workers and endangered lives in the name of profit.
Even though algorithms and machines can “learn” in a sense, they’re simply not human—they can’t imagine things, they don’t have a moral compass, and they are only as good as their creators. But humanity has the capacity to understand the stakes of its present moment—and so humans, not models, should be the ones in charge of humanity’s most important processes and questions.
Themes: Humanity vs. Technology; Fairness vs. Efficiency; Data, Transparency, and U.S. Democracy
We need to regulate the mathematical models that increasingly run our lives, and we must start with the modelers themselves. Just as doctors take the Hippocratic Oath before obtaining their medical licenses, O’Neil suggests, data scientists need to abide by moral codes and strictures that prevent them from doing harm to others. Regulating WMDs would be difficult and deeply involved, but O’Neil argues that even if it comes at a cost to efficiency, we must start to “impose human values” on WMDs and “get a grip on our techno-utopia.”
Right now, the tech world and the Big Data economy are uncharted territories in many ways. Those in charge are looking to maximize profit and influence as wide an audience as possible, so there need to be checks and balances in place. Without human regulation of these incredibly powerful technological tools, society will become more stratified, democracy will come under threat, and our “utopia” may soon devolve into a dystopia at the hands of predatory technology.
Themes
Humanity vs. Technology  Theme Icon
Fairness vs. Efficiency  Theme Icon
Data, Transparency, and U.S. Democracy Theme Icon
In order to disarm WMDs, we must admit that they can’t do everything. We must measure their impact by auditing their hidden algorithms and studying their biases and shortcomings. Unfair systems, like the value-added model used to score teachers, must be scrapped entirely. Rather than letting negative feedback loops slip through the cracks, analysts must figure out how WMDs can create positive feedback loops that change lives and benefit society. While some algorithms, like those of Amazon and Netflix, should be allowed to sort the kinds of entertainment people enjoy, recidivism models and other algorithms used in the justice system must be held to unimpeachable standards, even if that means revising them and changing their inputs altogether.
Here, O’Neil suggests that there should be different standards for algorithms in different sectors of modern life. Some models are relatively simplistic and don’t require much oversight, like the algorithms on streaming platforms that suggest programming based on what users have watched before. But when it comes to education, the justice system, and politics, there needs to be oversight and regulation, because if data is used the wrong way in these arenas, it could threaten social stability on a large scale.
Themes: Humanity vs. Technology; Fairness vs. Efficiency; Data, Transparency, and U.S. Democracy
Not all potential WMDs are nefarious. But the point is that we need analysts and auditors to maintain the systems that govern our lives and make them more transparent. Internal audits alone aren’t enough, O’Neil states, because companies that examine their own algorithms have no one to hold them accountable. Outside input is needed to make sure that companies like Google and Facebook stay in line. And regulations and transparency are needed in peer-to-peer lending, healthcare and health insurance, and credit score models.
The technology to make the Big Data economy more transparent exists—we just need to start using it. Even though data is being used in legitimate and transparent ways in many sectors, the potential for harm is too great to allow technology to influence our lives without any protective measures in place.
Themes: Humanity vs. Technology; Data, Transparency, and U.S. Democracy
In 2013, O’Neil began working as an intern at New York City’s Housing and Human Services Departments. She wanted to build the opposite of a WMD: a model that would help keep houseless people from getting pushed back into shelters and help them find stable housing. She and her team found that families who received Section 8 affordable housing vouchers were less likely to return to shelters. But the city government was trying to move families away from Section 8 and toward a new program, Advantage, which limited subsidies to a few years. Public officials were not happy to hear about her team’s research.
O’Neil’s experience with a branch of the New York City government shows that even public service organizations are using technology and data to exploit people rather than help them prosper. The NYC Housing and Human Services Departments were using technology to prey on people’s desperation and deepen class divides rather than using the data they’d gathered to actually change lives.
Themes: Humanity vs. Technology; Discrimination in Algorithms; Fairness vs. Efficiency; Data, Transparency, and U.S. Democracy
But Big Data, O’Neil asserts, should be disruptive when it comes to things that actually matter, like human rights. There are so many mathematical models out there today, O’Neil writes, that could be used to do good—but instead, they often wind up being abused. Yet there’s hope in the form of supply chain models that seek out sweatshops and other places where slave labor is being used to build products, and predictive models that try to pinpoint houses where children are more likely to suffer abuse.
The models O’Neil describes here have the potential to really help people—but if placed in the wrong hands and stripped of their transparency, they could easily become WMDs. She’s underscoring the importance of using innovative technology to help people rather than exploit them.
Themes: Humanity vs. Technology; Discrimination in Algorithms; Fairness vs. Efficiency
O’Neil hopes the WMDs that are around today will soon become relics of the past. She hopes that we can learn from our present moment, the early days of a “new revolution,” and bring transparency, fairness, and accountability to the age of Big Data.
Humanity is indeed in the midst of a “new revolution”—and the Big Data economy offers lots of opportunities for social change, economic reform, and a more egalitarian society. But if used incorrectly, WMDs could actually erode democracy and create more social division. So, humanity needs to recognize the weight of our present moment and rigorously ensure that algorithms and the technology they power are objective, fair, and reliable.
Themes: Humanity vs. Technology; Discrimination in Algorithms; Data, Transparency, and U.S. Democracy