To Combat Ubiquitous Surveillance, Make Privacy Public

In Real Life magazine, technology writer L. M. Sacasas suggests that our tendency to shrug and just go along with ubiquitous surveillance—from social media, our phones, and home devices like Amazon Echo—can be attributed in part to our narrow understanding of privacy. As Sacasas notes, most people understand privacy as an individual matter. If we want to protect our privacy, we take individual measures, like turning off our phones or anonymizing our internet surfing. But Sacasas argues that we should instead think of privacy as a public, collective concern. The erosion of privacy and the inexorable creep of surveillance create a society more amenable to obedience and control, and this affects everyone regardless of the individual privacy measures they take. Here’s an excerpt from his piece:

[As Brett Frischmann and Evan Selinger point out in Reengineering Humanity], the most serious threats digital technologies pose are not strictly personal concerns like identity theft or companies’ surreptitiously listening in on conversations but the emergence of a softly deterministic techno-social order designed chiefly to produce individuals that are its willing subjects. They note, for example, that when a school deploys fitness trackers as part of its physical education program, privacy concerns should extend not only to questions of students’ informed and meaningful consent. Even if consent is managed well, such a program, Frischmann and Selinger argue, “shapes the preferences of a generation of children to accept a 24/7 wearable surveillance device that collects and reports data.” This is to say that these programs contribute to “surveillance creep”: our gradual acquiescence to the expanding surveillance apparatus. Such an apparatus, in their view, appears pointed ultimately toward the goal of engineered determinism. Frischmann and Selinger conclude by advocating for legal, cultural, and design strategies that aim at securing our freedom from engineered determinism. And I would suggest that we would do well to reframe our understanding of privacy along similar lines.

A better understanding of privacy does not merely address the risk that someone will surreptitiously hear my conversations through my Apple Watch. Rather, it confronts the risk of emerging webs of manipulation and control that exert a softly deterministic influence over society. The Apple Watch (or the phone or the AI assistant or the Fitbit) is just one of many points at which these webs converge on individuals. Tech companies, which have much to gain from the normalization of ubiquitous surveillance, have presented their devices and apps as sources of connection, optimization, convenience, and pleasure. Individualized understandings of privacy have proved inadequate both to perceiving the risks and to meeting them effectively.

Image via Yahoo News.