Weapons of Math Destruction

by

Cathy O’Neil

Weapons of Math Destruction: Chapter 10: The Targeted Citizen Summary & Analysis

O’Neil imagines creating a petition for tougher regulations on WMDs and posting it to Facebook. As soon as she hits “send” on the post, the petition belongs to Facebook. The site’s algorithm gets to decide what to do with it and whom to show it to, based on the data it has about each of O’Neil’s “friends.” For many of them who don’t engage with many posts and who never circulate petitions, it’ll get buried in their feeds—but for others, it’ll pop up at the top. Facebook isn’t the “modern town square” it might seem to be. Its powerful algorithms and news-molding infrastructure have allowed it, in many ways, to “game” the U.S. political system.
By illustrating how Facebook’s selective algorithms work, O’Neil shows how models are eroding transparency. Facebook isn’t a “town square” where people can gather to openly discuss the same information. Rather, because Facebook hides certain things from certain people, it’s creating an uneven and contentious media landscape. And this is dangerous when Facebook applies its algorithm to news, because it can influence people politically.
Themes
Humanity vs. Technology
Data, Transparency, and U.S. Democracy
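The feed-burying behavior O’Neil describes can be sketched as a simple engagement-weighted sort. This is a minimal illustration only: the scoring rule and all the data below are invented, and Facebook’s real ranking system is proprietary and vastly more complex.

```python
# A minimal sketch of engagement-based feed ranking. The scoring rule and all
# data here are invented for illustration; Facebook's actual algorithm is
# proprietary and far more complex.

def rank_for_friend(posts, friend):
    """Order posts by a crude affinity score: the friend's past engagement
    with each post type. Unfamiliar types default to zero and sink."""
    return sorted(posts,
                  key=lambda p: friend["engagement"].get(p["type"], 0.0),
                  reverse=True)

posts = [{"id": "petition", "type": "petition"},
         {"id": "cat_video", "type": "video"}]

# A friend who circulates petitions vs. one who only watches videos.
activist = {"engagement": {"petition": 0.9, "video": 0.1}}
lurker = {"engagement": {"video": 0.6}}

top_for_activist = rank_for_friend(posts, activist)[0]["id"]  # petition surfaces
top_for_lurker = rank_for_friend(posts, lurker)[0]["id"]      # petition is buried
```

The point of the sketch is that the same post gets entirely different visibility for different people, with no notice to either of them.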
During the 2010 and 2012 U.S. elections, Facebook ran experiments to hone a tool called the “voter megaphone” that would allow people to spread the word about voting. Facebook encouraged over 61 million American users to get out and vote by leveraging peer pressure against them. At the same time, it was studying how different types of updates influenced voting behavior. Because government policies heavily affect the profits of companies like Facebook, Google, Apple, Microsoft, Amazon, and Verizon, these companies spend a lot of money on lobbying and political donations. Now, they can also influence Americans’ political behavior and, as a result, the shape of American government.
In today’s world, technology and data companies have a say in U.S. government. Now that these companies hold political sway, they’re coming up with algorithms and models that will more clearly show them just how much influence they possess—and how directly they can change how people participate in civic life.
Themes
Humanity vs. Technology
Data, Transparency, and U.S. Democracy
Facebook’s grand experiment with the voter megaphone showed that it increased turnout by nearly 350,000 voters—a big enough group to swing whole states. Facebook used the initial 2010 experiment to study how our friends’ behavior impacts our own—and in 2012, it took things a step further. For two million politically engaged users, researchers tweaked the algorithm to show more news instead of social updates or funny videos. Researchers wanted to see if getting news from friends would change people’s political behavior—and it did: voter participation in the group went up by three percent.
Algorithms can change the way people see the world—and thus how they participate in society. People will respond to information they receive from outlets they trust, and Facebook’s experiment showed how powerful algorithms are in manipulating this trust. Although increasing voter turnout is a positive result in theory, O’Neil implies that readers should be alarmed about the fact that tech companies have the ability to change outcomes in elections.
Themes
Humanity vs. Technology
Data, Transparency, and U.S. Democracy
Editors at news outlets, of course, decide what their readers see, and from what angle they see it. But when a news outlet covers a story, everyone can see it. When Facebook delivers news, the process is mysterious—it’s not the “neutral go-between” it might seem to be. Most users are unaware that the company is tinkering with their news feeds and filtering what they see (and which of their posts their network can see). Research indicates that showing Facebook users positive updates puts them in better moods, while showing them negative updates from friends puts them in worse ones. Emotional states can be transferred through the internet—and through Facebook’s algorithms.
Again, O’Neil is using Facebook’s information delivery model as an example of how powerful algorithms can be. These models have the power to change people’s opinions and even their moods. This poses a problem because Facebook’s algorithms are hidden from the public, so users are being manipulated without their consent.
Themes
Humanity vs. Technology
Data, Transparency, and U.S. Democracy
O’Neil doesn’t believe that Facebook’s researchers are actively trying to game the political system. But she does believe that Facebook has the power to determine what people learn about the political system, how they feel about it, and how they participate in civic life as a result. Facebook and Google haven’t yet turned their algorithms into political WMDs, but the “potential for abuse is vast.”
Facebook might not be actively trying to change how its users think and feel, especially where politics are concerned. Nevertheless, Facebook’s algorithms do have enormous influence over their users—so the company needs to make sure it doesn’t misuse that influence. 
Themes
Humanity vs. Technology
Data, Transparency, and U.S. Democracy
In the spring of 2012, Mitt Romney had all but secured the Republican presidential nomination. He traveled to Florida for a fundraiser at the home of Marc Leder, an investor who’d given over $300,000 to Romney’s campaign. Romney assumed he was walking into a closed setting with a likeminded group of people—but as he let loose during his speech with traditional Republican talking points, he underestimated his audience. One of the caterers recorded Romney claiming that 47 percent of the U.S. population were nothing but “takers” and posted the video online for the world to see.
This passage is an example of a real-life model in action. Romney had developed a model for how he’d speak to attendees at Leder’s gathering—but he’d failed to account for a crucial piece of data (the demographics of everyone in attendance), and so his model backfired.
Themes
Data, Transparency, and U.S. Democracy
Most politicians tailor their pitches for lots of different subgroups on the campaign trail. This is, essentially, a form of modern consumer marketing that’s driven by carefully gathered data. In the incident at Leder’s house, Romney was speaking based on one set of data—but he ignored the possibility that people outside his target audience might be in attendance.
Politicians can use data about potential voters to help them get ahead, but doing so can also hinder their campaigns. This illustrates how crucial it is for data to be thoroughly vetted before it’s used to model an outcome—especially when that data is related to politics.
Themes
Data, Transparency, and U.S. Democracy
Nowadays, Big Data has given politicians lots of powerful tools for targeting “micro-groups” of citizens for votes and donations through carefully honed messages. In July of 2011, the Obama campaign started hiring analytics experts who would help create and target groups of like-minded voters. Rayid Ghani, one of the campaign’s data scientists, had previously worked on projects for a consulting firm that analyzed grocery stores’ consumer data. This information was used to create customized shopping plans for many kinds of shoppers: coupon-clippers, brand loyalists, foodies, and so on. Now, Ghani was trying to see if similar calculations would work on swing voters (those who aren’t firm supporters of any one candidate or political party).
The Big Data economy isn’t just changing how companies work with their employees, how colleges handle admissions, or how insurers decide whom they should cover. It’s changing political and democratic institutions all over the world, as well. Algorithms play a significant role in contemporary politics, because faulty data can now quite literally decide the fate of an entire country. This is a dangerous precedent to set as tech becomes a bigger part of everyday life with each passing year.
Themes
Humanity vs. Technology
Data, Transparency, and U.S. Democracy
After conducting deep interviews with a few thousand people from different groups, Ghani and his team sifted through those interviewees’ data and demographics to create mathematical profiles of them. Then they scoured national databases to find people who matched those profiles, targeting each group with advertisements to see which messages appealed to them. By the end of these tests, the researchers had their group of 15 million swing voters.
By finding similar people based on voters’ voluntarily submitted data, Ghani’s team was essentially creating proxies. From this, they were able to reach a massive audience just by extrapolating information about a small group, whether that information was relevant to everyone in their target audience or not. This illustrates how readily available—and yet potentially useless—proxy data is.
Themes
Humanity vs. Technology
Data, Transparency, and U.S. Democracy
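The lookalike matching described above can be sketched as a nearest-neighbor search: build numeric profiles from a small interviewed group, then scan a larger voter file for people whose profiles fall close to one of them. Everything in this sketch—the field names, the scaling factors, the distance threshold—is invented; it only illustrates the general shape of the technique, not the Obama campaign’s actual methods.

```python
import math

# A hypothetical sketch of lookalike voter matching. All field names, scaling
# factors, and the distance threshold are invented for illustration.

def profile(person):
    """Reduce a voter record to a crude numeric feature vector (a proxy)."""
    return (person["age"] / 100, person["income"] / 200_000,
            person["turnout_history"])

def distance(a, b):
    """Euclidean distance between two profile vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def find_lookalikes(interviewed, voter_file, threshold=0.15):
    """Return voter-file records within `threshold` of any interviewed profile."""
    targets = [profile(p) for p in interviewed]
    return [v for v in voter_file
            if min(distance(profile(v), t) for t in targets) <= threshold]

interviewed = [{"age": 34, "income": 48_000, "turnout_history": 0.5}]
voter_file = [
    {"name": "A", "age": 36, "income": 52_000, "turnout_history": 0.5},
    {"name": "B", "age": 70, "income": 150_000, "turnout_history": 1.0},
]
lookalikes = find_lookalikes(interviewed, voter_file)  # only "A" falls close
```

The sketch also shows why proxy data can mislead: a voter whose numbers resemble an interviewee’s may still hold entirely different views.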
Four years later, Hillary Clinton’s campaign would build on the Obama research team’s methodology to create a data system that let it manage millions of voters. But many of the data-gathering methods campaigns use aren’t necessarily legitimate. Data firms keep tabs on users’ “likes” and use them to score people on the “big five” personality traits (openness, conscientiousness, extroversion, agreeableness, and neuroticism). Then they develop targeted ads based on this information. Not all of these methods are useful—but some are, and they’ve essentially turned the voting public into a kind of fluctuating financial market.
Political campaigns are now treating voters like a tech company might treat its users—by appealing to their lowest common denominator. This, in O’Neil’s estimation, is a dangerous new development. Political campaigns should be held to a higher standard than leisure pursuits like social media platforms, because they’re deeply intertwined with the U.S.’s democratic process.
Themes
Data, Transparency, and U.S. Democracy
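Scoring people on the “big five” from their “likes” amounts to a weighted tally. The sketch below is purely illustrative: the pages, trait weights, and ad variants are all invented, and real firms fit such weights from survey data rather than assigning them by hand.

```python
# A hypothetical sketch of big-five scoring from "likes" and trait-targeted
# ad selection. Pages, weights, and ad variants are all invented.

BIG_FIVE = ("openness", "conscientiousness", "extroversion",
            "agreeableness", "neuroticism")

# Invented per-page weights: liking a page nudges certain trait scores up.
TRAIT_WEIGHTS = {
    "travel_blog": {"openness": 0.8, "extroversion": 0.3},
    "budget_tips": {"conscientiousness": 0.7},
    "late_night_tv": {"extroversion": 0.5, "neuroticism": 0.2},
}

def score_user(likes):
    """Sum per-page weights into a big-five score vector."""
    scores = {t: 0.0 for t in BIG_FIVE}
    for page in likes:
        for trait, weight in TRAIT_WEIGHTS.get(page, {}).items():
            scores[trait] += weight
    return scores

def pick_ad(scores):
    """Serve the ad variant keyed to the user's strongest trait."""
    variants = {"openness": "new-horizons ad",
                "conscientiousness": "fiscal-responsibility ad"}
    return variants.get(max(scores, key=scores.get), "generic ad")

scores = score_user(["travel_blog", "budget_tips"])  # openness dominates
ad = pick_ad(scores)
```

Crude as it is, a pipeline of this shape lets a campaign show different voters different versions of the same candidate, which is exactly the opacity O’Neil warns about.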
Lobbyists and interest groups, too, use these microtargeting tactics to reach more people. This can be dangerous—certain political groups can create targeted advertisements full of false information that spread rapidly. The anti-Obama “birther” campaign got off the ground this way, as did a lot of misinformation about things like abortion and immigration. Even television is moving toward personalized or microtargeted advertising—and as individualized ads become more common, it’ll be harder to understand or even access what our neighbors and friends are seeing. Political marketers have scores of information about us, but we have little about them or their methods.
Microtargeting allows political campaigns to reach more voters—but it also means that the approach behind microtargeting can be used to spread false information. Marketers from all sectors of the economy know that appealing to people based on data-backed research is a good way to bring in new users, new customers, or new voters. But the algorithms that enable that efficient approach are also being used to undermine facts and democracy.
Themes
Humanity vs. Technology
Fairness vs. Efficiency
Data, Transparency, and U.S. Democracy
Microtargeting is vast, largely hidden, and unaccountable or unregulated—so it is, in O’Neil’s estimation, a WMD. And it’s actively undermining and threatening U.S. democracy. Additionally, what’s so frightening about political microtargeting is that it’s not aimed only at the rich or the poor—it’s aimed at everyone except for those who aren’t expected to vote. This creates a feedback loop that keeps uncertain or disenfranchised voters out of the civic system.
Like many other WMDs, political microtargeting could be used to help and educate people—but instead, it’s used to manipulate and mislead them. O’Neil implies that humanity needs to swiftly and carefully begin regulating WMDs so that they won’t be used to threaten the truth and destabilize democracy in the U.S. (or abroad).
Themes
Humanity vs. Technology
Data, Transparency, and U.S. Democracy