No Exit: The Digital Edition


Privacy advocates say we should care about privacy because its erosion threatens liberty. “A human being who lives in a world in which he thinks he is always being watched is a human being who makes choices not as a free individual but as someone who is trying to conform to what is expected and demanded of them,” Glenn Greenwald said in an interview. His statement echoes staunch privacy defenders of yore, like Justice Louis Brandeis, who described privacy as “the most comprehensive of rights and the right most valued by civilized men.”

The public outrage that followed revelations about mass surveillance of citizens by the National Security Agency suggests many Americans agree. But appealing only to the ethical justifications for privacy won’t be enough to spur the rescue of this right. For one, it’s not clear how the visceral desire for privacy translates into actual rules and policies in the digital age, especially when surrendering personal data seems like the price of admission. It’s discomforting that a handful of government and corporate actors know so much about us—and yet if we can’t conceive of tangible ways in which surveillance actively harms us, is our alarm misplaced?

Julia Angwin’s Dragnet Nation answers this question by documenting the facts and clarifying the stakes. In her new book, Angwin, a former Wall Street Journal reporter now with ProPublica, identifies how companies and governments are using our personal data and details its effects on ordinary people. Her account grounds abstract notions of privacy in concrete stories, like that of Sharon Gill and Bilal Ahmed, whose online conversations about mental illness were monitored and sold to pharmaceutical companies, or of Yasir Afifi, who discovered his phone calls, e-mails, and car were being tapped by the FBI because of one rogue comment his friend posted on Reddit.

Angwin traces the birth of our dragnet nation to the early 2000s, when the attacks of September 11 and the bursting of the dot-com bubble led both the government and Silicon Valley to start collecting reams of personal data—the government to expand its intelligence reach, and tech companies to pursue a new business model. Before 2001 the off-the-shelf surveillance market, made up of hacking tools that intercept communications, track locations, or monitor browsing, barely existed. Today it pulls in close to $5 billion a year. Disentangling the surveillance industry reveals a suite of sub-industries—data brokers, analytics firms, online ad companies, businesses that track mobile locations—and companies whose names most of us have never heard (BlueKai, PYCO). That they are invisible to us belies how intimately familiar they are with our lives.

For example, data brokers can identify, among other facts, every address you’ve lived at, your education level, approximate income, ethnicity, political affiliation, stock portfolio, the age of your children, and the car you own. They use our browsing histories to sort us into target categories ranging from “married sophisticate” to “rural and barely making it.” As Angwin’s account makes clear, this knowledge is not just creepy—it represents a gaping and growing informational asymmetry between individuals and the institutions tracking them. “Anyone who holds a vast amount of information about us has power over us,” Angwin writes. “[P]eople who hold our data can subject us to embarrassment, or drain our pocketbooks, or accuse us of criminal behavior.”

One baleful way companies can harness this power is to show us only tailored prices and goods. Angwin calls this a “hall of mirrors,” where all of the online offers and prices you see reflect not just your interests but what a company approximates you can afford. Until recently most online advertising was pretty crude—if you searched for boots on Amazon, ads for boots would follow you around—but firms have since developed more sophisticated strategies, which retailers in the physical world are now adopting too.

For example, Angwin’s team at the Wall Street Journal revealed that Orbitz steers Mac users to pricier hotels. It also uncovered that a stapler on cost more if you were browsing from a zip code with fewer rival stores, which meant that, on average, low-income areas paid higher prices. Meanwhile a study by Benjamin Shiller, an assistant economics professor at Brandeis University, found that Netflix could use personalized pricing to charge some people twice as much as others for the same product—a practice that would raise Netflix’s profits by 12.2 percent.

Ryan Calo, assistant professor at the University of Washington School of Law, predicts that companies will soon adjust offers and prices based on when we are most vulnerable. A working mother might be charged more when she buys diapers online at the end of a long day. A son looking to fly across the country to visit a sick father might face steeper ticket prices ahead of a major operation, information gleaned through his e-mail inbox.

Exploited this way, informational asymmetries transfer wealth from the surveilled to the surveillers. This happens in two ways. Price discrimination and predatory marketing empower companies to extract more dollars from us. And only companies capture the monetary value of our personal data, which has become the currency of using “free” services like Gmail and Facebook. Economist Joseph Stiglitz writes extensively on how growing information asymmetry feeds economic inequality, an argument echoed by thinkers like Nathan Newman, fellow at the Information Law Institute at New York University School of Law, and computer scientist Jaron Lanier.

That companies are mining our personal data to quietly redraw the range of options available to us is troubling. What’s worse is there is little way to check whether the information they’re using is even accurate. Angwin repeatedly turns up data stored on her that is blatantly wrong. Institutions that rely on credit reports are required by law to inform individuals if they are denied a job, loan, or insurance because of data in the report—but nothing provides similar safeguards for other forms of data. Some sectors, like health and finance, require data collectors to disclose how they use this information and to seek user consent, but even these rules leave us exposed. For one, many of the data protection policies companies tout—like the promise that they only share information on an anonymous basis—are quite flimsy. If my online dossier is being used to sell me pricier books, do I care whether it knows my mother’s maiden name?

In addition to tracking down how data is being used, Angwin attempts to escape the dragnet. Her goal is to avoid as much indiscriminate tracking as she can while staying connected to the digital world—a quest she undertakes both to report on its challenges and to protect herself as a journalist, parent, and consumer. Over the course of the book, she moves her data to an encrypted cloud service, switches to a “burner” phone, quits Google search, downloads ad blockers, adopts 18-character randomly generated passwords stored in a master password vault, and creates a fake online persona she names Ida Tarbell. Her journey validates anybody who believes that resisting online data collection is confusing and taxing. One especially grueling episode, involving two kinds of encryption software that won’t link up and multiple calls to customer service, leaves her nauseated and reaching for a glass of red wine.

Angwin acknowledges that her efforts to avoid surveillance don’t get her very far. Months after resorting to a paid service that promises to remove her details from the biggest data brokers, she finds her data still appearing on people-lookup sites. “You hand over the bribe, but you’re never quite sure if it will get results,” she notes. It’s difficult to track the trackers. Still, Angwin describes her attempt as a valuable type of protest that—like desegregation sit-ins during the 1960s—could birth a movement. “The sit-ins did not immediately destroy segregation, but they led to a national conversation that ultimately unraveled it,” she writes. “My hope is that if enough people join me in refusing to consent to ubiquitous indiscriminate surveillance, we might also prompt a conversation that could unravel it.”

It’s an expensive form of protest, though, one that requires time, digital literacy, and a level of technological sophistication unavailable to the vast majority of Internet users, let alone those most susceptible to the harms of surveillance. Research by Seeta Peña Gangadharan at the Open Technology Institute, for example, shows how low-income individuals, immigrants, and people of color face greater risk of data-driven discrimination. (The Open Technology Institute is a part of the New America Foundation, where I work.) When reflecting on the sundry hurdles she faced, Angwin overlooks this accompanying fact: her means of resistance excludes the very people with most reason to resist.


Ultimately, Angwin’s reporting teaches us three things.

One is that indiscriminate surveillance is giving rise to pronounced informational asymmetries that disfavor us as consumers and as citizens. The way companies can use these insights to discriminate among us can, for example, exacerbate inequality and expose minority and low-income populations to exploitation—a fact that the language of privacy doesn’t fully capture. Framing data collection primarily as a civil liberties issue implies that surveillance is only a problem if we personally believe it constrains our freedom. Establishing that there are hard political and economic consequences—and conceiving new language to encapsulate these effects—gives privacy advocates a stronger hand, and a basis for engaging people who might dismiss privacy as anachronistic.

The second is that designing new markets or technologies won’t be enough to restore a fair marketplace or let us evade surveillance. One of the ironies of the present situation is that while companies reap the value of our personal data, individuals have a hard time assigning a value to their own information, which is now dispersed among so many different brokers that gaining back a monopoly on our own data is effectively impossible. Meanwhile, Angwin’s struggle with the panoply of counter-surveillance tools—even when armed with time, some money to spare, and the help of leading data privacy experts—shows that laypeople don’t stand a fighting chance. As Angwin notes, what we will need are new laws regulating how companies can collect and use information on us. Data has been declared the “oil of the new economy,” but without any new rules it’ll remain the Wild West.

The third is that one of the primary threats we face is that privacy becomes entirely commoditized, rendering it a good to buy rather than a right we all enjoy. Already the tide is pulling us in that direction: Angwin ends up paying money to protect herself (not to mention expending heaps of time), and must resist discounts from vendors looking to scoop up her personal information. Other industries are taking the cue: AT&T recently slashed prices for fiber-optic customers in Austin—provided they allow it to track their online browsing. By normalizing surveillance as the default setting, these schemes risk widening present inequalities.

Rescuing privacy will require that we rethink the parameters of acceptable and unacceptable surveillance and of how the data it collects is used. The stories Angwin captures equip us to do so with greater rigor and nuance. But convincing people that privacy matters might require that we not talk in terms of “privacy” at all. After all, what’s at stake is wealth and power, and whether we adopt policies that promote equality or those that allow a few to enrich themselves at the expense of the many.
