Weapons of Math Destruction

by

Cathy O’Neil

Weapons of Math Destruction: Introduction Summary & Analysis

As a young girl, author Cathy O’Neil was a self-described “math nerd.” She loved math because it was simple and neat when so much of the world was messy. While majoring in math in college and earning a PhD in algebraic number theory, O’Neil enjoyed adding to the field of mathematics, helping to expand its bounds.
By introducing herself as a person whose life has centered around math, Cathy O’Neil establishes both her credibility as a mathematician and her deep investment in making sure that mathematics is used to improve the world.
Themes: Humanity vs. Technology
After teaching math at Barnard College for several years, O’Neil left academia for a new “laboratory”—the global economy. As a “quant” (quantitative analyst) for D.E. Shaw, a major hedge fund, O’Neil was amazed by how the operations she and her team performed each day translated into “trillions of dollars sloshing” between accounts. But in the fall of 2008, after just a year at the company, everything changed—the financial crisis brought the economy to a halt.
O’Neil hoped to use math to educate people and help them live sustainable lives. But the longer she worked with math, the clearer it became that math was an extremely powerful tool that could change the world—or derail it.  
Themes: Humanity vs. Technology
The financial collapse was made possible by people like O’Neil—mathematicians who had multiplied the “chaos and misfortune” of the crisis by misusing math. But rather than taking a step back after the crisis, people instead doubled down on new mathematical techniques—and mathematicians began studying people’s desires, movements, and spending habits, calculating each human’s potential as “students, workers, lovers, criminals.” This, O’Neil writes, is now known as the Big Data economy.
O’Neil implies that the financial crisis should have caused people to reckon with how powerful a force math was. But instead of stopping to think about math’s role in the financial crisis, corporations only barreled ahead with their use of complex mathematics in everyday operations. This situation introduces the tension between humanity and technology: without a human element to “Big Data” and technology, mathematical models threaten to dehumanize people by turning them into nothing more than statistics.
Themes: Humanity vs. Technology
But around 2010, as Big Data saw mathematics involving itself in human affairs like never before, O’Neil began to feel troubled. People, after all, were imperfect and fallible—and the algorithms and models now driving the data economy had been encoded with their human creators’ prejudices and biases. Yet Big Data seemed unimpeachable, even as it further deepened the wealth divide in global society. O’Neil calls these harmful models Weapons of Math Destruction—WMDs for short.
O’Neil suggests that mathematical algorithms and models aren’t necessarily more objective than people are—after all, these algorithms are created by humans who are naturally biased. To better explain this idea, O’Neil introduces the concept of a “weapon of math destruction”—a mathematical tool that has the power to create widespread chaos in society by increasing disparities between different sexes, races, and classes of people. 
Themes: Humanity vs. Technology; Discrimination in Algorithms; Fairness vs. Efficiency
One case of a WMD that began with an admirable goal but quickly became destructive started in 2007 in Washington, D.C. The city’s new mayor, Adrian Fenty, dedicated himself to turning around underperforming schools throughout the city. Michelle Rhee, the chancellor of schools whom Fenty appointed, developed a teacher assessment tool called IMPACT that would use data to weed out low-performing teachers.
This passage introduces the idea that a WMD can have noble origins. D.C. officials just wanted to reform their school system—but O’Neil alludes to the fact that in the process, they created a situation in which a flawed mathematical model did more harm than good.
Themes: Humanity vs. Technology
At this time, Sarah Wysocki was a fifth-grade teacher in Washington, D.C. who was beloved by her students and had consistently received great performance reviews from her school’s principal—but she got a terrible score on her IMPACT evaluation. Because of IMPACT’s algorithm, she was fired along with more than 200 other area teachers. The algorithm had promised fairness—but Wysocki felt that the numbers were anything but fair.
While algorithms are often incredibly efficient, they’re not always fair. In this case, an algorithm that was meant to make the D.C. school system better was inherently flawed. And because it wasn’t regulated, it ended up having unfair consequences for teachers who performed well in the classroom.
Themes: Humanity vs. Technology; Fairness vs. Efficiency
The Princeton-based Mathematica Policy Research had come up with the IMPACT evaluation system. They knew that measuring students’ educational progress was a complex issue (and tried to make their algorithm complex, too). But they couldn’t pinpoint how much of any given student’s struggles in school resulted from outside factors like poverty, trouble at home, or social issues at school—and how much they reflected a bad teacher.
Algorithms and models can’t solve every problem—especially when a problem is complicated and rooted in human struggles. While a model can produce data about a student’s test scores, it can’t know that student’s individual strengths and weaknesses—nor can it evaluate how those factors are being handled in the classroom.
Themes: Humanity vs. Technology; Discrimination in Algorithms; Fairness vs. Efficiency
There were too many factors in the process of teaching and learning, Wysocki argued, for an algorithm to quantify them. Without huge collections of data points or feedback to warn statisticians when they’re off track, data models can become self-perpetuating and produce results that don’t necessarily tell the whole story—but that confirm whatever the statisticians set out to prove. This dynamic is highly destructive.
Wysocki essentially lost her job because an algorithm detected that her students weren’t scoring high enough. This might have been efficient in terms of removing teachers whose students weren’t really thriving—but O’Neil implies that it wasn’t fair or even effective at solving the problem of low-performing students.
Themes: Humanity vs. Technology; Fairness vs. Efficiency
The “effectiveness” of the model that Mathematica used to weed out D.C.’s lowest-scoring teachers seemed unimpeachable. But in fact, it was an example of a WMD feedback loop—a situation that takes place when models “define their own reality and use it to justify their results.” Other examples of this include employers who use credit scores in the hiring process, believing that responsible people have higher credit and are thus more likely to do well in a role. As a result, affluent people get jobs, while poor people get caught in a downward spiral. WMDs are dangerous because they’re engineered to evaluate large groups of people, but the criteria they use to make such sweeping judgments are unknowable to everyone but their creators.
By claiming that models often create results that support their purpose, O’Neil suggests that these algorithms are effective only on their own terms. They’re not fair or even particularly useful in securing accurate data and results, because they “define their own reality” by cherry-picking certain data points. So, these “weapons of math destruction” are creating widespread damage by deepening social inequality.
Themes: Humanity vs. Technology; Discrimination in Algorithms; Fairness vs. Efficiency
At the start of her last year at MacFarland Middle, Sarah Wysocki saw that many incoming fifth graders from a nearby elementary school had scored well on their tests. But when they arrived in her classroom, they struggled to read simple sentences. Later, investigations by major newspapers would reveal that there had been widespread cheating on these exams—but the students weren’t the ones cheating. Their teachers, knowing that higher student test scores would reflect well on their own performance in the eyes of evaluation algorithms, were correcting their students’ tests for them. Wysocki would later become convinced that she herself was a victim of other teachers’ desperate actions in the face of a WMD. The human victims of WMDs, O’Neil argues, are held to higher standards than the algorithms themselves.
WMDs can create an environment where people take desperate measures to avoid the algorithms’ harsh judgments. In this example, scoring systems that were supposed to evaluate student success caused such fear and stress for teachers that they felt compelled to lie and cheat on behalf of their students. In this way, the standardized tests did the opposite of what they were created to do, harming both students and teachers in the process. This illustrates how destructive unregulated WMDs can be.
Themes: Humanity vs. Technology; Fairness vs. Efficiency
In 2011, O’Neil quit her job at D.E. Shaw and joined an e-commerce start-up as a data scientist. But she was disheartened to find that WMDs were, by now, at the heart of every industry—and they were deepening inequality everywhere. Scandalized and outraged, O’Neil started taking action: she launched a blog that would expose how bad statistics and biased models were creating dangerous feedback loops. She also joined Occupy Wall Street and began speaking at the Alternative Banking Group at Columbia University, advocating for financial reform.
This passage illustrates O’Neil’s investment in combating WMDs across multiple professional sectors, and in her personal life as well. O’Neil has long advocated for fairness, direct human involvement, and regulation in the use of mathematical modeling.
Themes: Humanity vs. Technology; Fairness vs. Efficiency
Yet mathematical models still control many different sectors—from advertising to schools to prisons. Models, algorithms, and software often exist only to grow revenue. Profits of any kind, O’Neil argues, are “serving as a stand-in […] for the truth.” WMDs are engineered to make money or to create clout, and they ignore the people they hurt in the process—people like Sarah Wysocki. O’Neil announces her intent to take her readers on a tour of “the dark side of Big Data” and examine the injustices that WMDs cause as they control most aspects of modern life.
Big Data and the use of mathematical modeling are changing how we live. Yet people are being victimized by algorithms that prioritize profits over truth and efficiency over fairness and equality. If WMDs aren’t regulated—let alone called out for what they are—society may become more divided and unjust.
Themes: Humanity vs. Technology; Discrimination in Algorithms; Fairness vs. Efficiency