This article was originally published in The Atlantic on May 15, 2017.
Internet service providers have realized that they are sitting on a treasure chest of data about your online activities that they can sell to advertisers.
Recognizing the privacy threat, the Federal Communications Commission adopted rules that would have stopped them from doing so without your consent, but Congress recently shot down the regulation.
This is a big deal.
As part of the Princeton Web Transparency and Accountability Project, we’ve been studying who tracks you online and how they do it. Here’s why we think the fight over browsing histories is vital to civil liberties and to a functioning democracy.
Privacy doesn’t merely benefit individuals; it fundamentally shapes how society functions. It is crucial for marginalized communities and for social movements built around once-stigmatized causes, such as the fight for marriage equality. Privacy enables these groups to network, organize, and develop their ideas and platforms before challenging the status quo. But when people know they’re being tracked and surveilled, they change their behavior. This chilling effect hurts our intellectual freedoms and our capacity for social progress.
The data that tracks our behavior feeds into machine-learning algorithms that make judgments about us. When used for advertising, they can reproduce our own prejudiced behavior. Latanya Sweeney, the director of the Data Privacy Lab at Harvard University, found that Google searches for black-sounding names were more likely to return ads for arrest records than searches for white-sounding names, likely because the ad-serving algorithm learned to predict what users would click on.
Marketers can also use machine learning to figure out your unique quirks—do you respond better to words or to pictures? Do you make impulsive shopping decisions?—to target you with exactly the advertisement that will best persuade you.
When consequential decisions about employment or loans are made using this kind of data, the result can feel Kafkaesque, because these systems aren’t programmed to explain their decisions. There aren’t yet effective ways for humans to hold algorithms accountable for how they categorize us. And when algorithms learn what we like and feed us more of it, they amplify the notorious filter bubble and deepen political polarization.
Web tracking today is breathtaking in its scope and sophistication. There are hundreds of entities in the business of following you from site to site, and the average popular website embeds about 50 such trackers. We’ve also found that just about every new feature that’s introduced in web browsers gets abused in creative ways to “fingerprint” your computer or mobile device. Even identical-looking devices tend to behave in subtly different ways, such as by supporting different sets of fonts. It’s as if each device has its own personality. This means that even if you clear your cookies or log out of a website, your device fingerprint can still give away who you are.
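To make this concrete, here is a minimal sketch, in TypeScript for the browser, of the kind of signal fingerprinting relies on. The attributes and the font list are illustrative choices, not a description of what any particular tracker collects.

```typescript
// Illustrative browser-fingerprinting sketch (not any real tracker's code).
// It combines a few device attributes, including which fonts render,
// into a single identifier-like string.

function detectFont(font: string): boolean {
  // Measure a test string in the candidate font (falling back to monospace);
  // if the width differs from plain monospace, the font is likely installed.
  const canvas = document.createElement("canvas");
  const ctx = canvas.getContext("2d");
  if (!ctx) return false;
  const sample = "mmmmmmmmmmlli";
  ctx.font = "72px monospace";
  const baseline = ctx.measureText(sample).width;
  ctx.font = `72px "${font}", monospace`;
  return ctx.measureText(sample).width !== baseline;
}

function fingerprint(): string {
  const candidateFonts = ["Arial", "Calibri", "Helvetica Neue", "Ubuntu"]; // illustrative list
  const attributes = [
    navigator.userAgent,
    navigator.language,
    `${screen.width}x${screen.height}x${screen.colorDepth}`,
    String(new Date().getTimezoneOffset()),
    candidateFonts.filter(detectFont).join(","),
  ];
  return attributes.join("|"); // a real tracker would hash this into a compact ID
}

console.log(fingerprint());
```

Because none of these attributes depends on cookies, clearing your cookies doesn’t change the result.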
Worse, the distinction between commercial tracking and government surveillance is thin and getting thinner. The satirical website The Onion once ran a story with this headline: “CIA’s ‘Facebook’ Program Dramatically Cut Agency’s Costs.” Reality isn’t far off. The Snowden leaks revealed that the NSA piggybacks on advertising cookies, and in a technical paper we showed that this can be devastatingly effective. Hacks and data breaches of commercial systems have also become a major part of the strategies of nation-state actors.
The good news is how effective technology can be in preventing tracking. We found that ad blockers and other browser-privacy tools can decrease tracking by 80 percent or more. More complex tools such as the Tor browser can be even more effective. In other words, the more technically savvy among us can enjoy dramatically better privacy and digital freedoms. But this has resulted in a technological “arms race,” which is worrying in itself, and more so because the technical savvy needed to use these tools is concentrated among historically advantaged groups. Meanwhile, publishers are caught in the ad-blocking crossfire, endangering the free press.
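As a rough illustration of how such tools work (the domains below are hypothetical stand-ins, not a real filter list), a blocker simply refuses to load requests whose destination matches a list of known tracking hosts:

```typescript
// Simplified sketch of filter-list blocking, assuming a hypothetical blocklist.
const trackerBlocklist = new Set(["tracker.example", "ads.example", "metrics.example"]);

function shouldBlock(requestUrl: string): boolean {
  const host = new URL(requestUrl).hostname;
  // Block if the host, or any parent domain of it, appears on the list.
  return host
    .split(".")
    .map((_, i, parts) => parts.slice(i).join("."))
    .some((domain) => trackerBlocklist.has(domain));
}

console.log(shouldBlock("https://ads.example/pixel.gif"));    // true: request is dropped
console.log(shouldBlock("https://news.example.com/article")); // false: request proceeds
```

Real blockers add many refinements, such as community-maintained filter lists and rules that strip tracking scripts rather than whole pages, but the core idea is this kind of request filtering.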
One bright spot is that online privacy research has had a tremendous effect. It has helped regulators curb the worst of the offenses, forced companies to roll back incursions because of public-relations pressure, spurred the development of privacy tools, and fomented a healthy public debate about online tracking. The fight for privacy is now inextricably linked to the fight for digital civil liberties and democratic values, and it is a movement that includes activists, artists, journalists, researchers, and everyday users of technology. There’s tremendous power in your hands to take charge of your own privacy as well as foster these societal values.
Arvind Narayanan is an assistant professor of computer science at Princeton University.
Dillon Reisman is an independent researcher who works with the Princeton Web Transparency and Accountability Project.
This article is part of The Democracy Project, a collaboration with The Atlantic.