Week 1 Reading: Surveillance and Capture: Two Models of Privacy

This week we read a paper titled "Surveillance and Capture: Two Models of Privacy" by Philip Agre.

When I first started reading this article, I didn't look at its publication date, but I soon found myself wondering about it. The article discusses two frameworks – the surveillance and capture models – and at first I had trouble understanding capture the way Agre was describing it. When I saw the date, I realized why: he was trying to describe something that, although ubiquitous today, didn't fully exist then, and there wasn't yet a standard language to describe it. There were many interesting and thought-provoking (although dense and often abstruse) parts of this article, but I'll share my reactions to a couple in particular:

“But these systems all participate in a trade-off that goes to the core of computing: a computer – at least as computers are currently understood – can compute only with what it captures; so the less a system captures, the less functionality it can provide to its users.” (p.113)

I used to do public policy research, which involved designing and running large-scale studies to evaluate programs and policies related to education and youth violence prevention. We relied on large administrative datasets from the police department and the public school system. My colleagues and I used to talk about how our jobs didn't exist when we were in college, how they were only made possible by the rise of "big data," and isn't that amazing?! But we also talked about the challenges of measuring life outcomes this way, because there is so much the data don't include about a place or a person. Sometimes I wondered if we were measuring reality or just the reality that someone else had made available to us. Agre acknowledges this by pointing out that a capture system is a "correspondence" between reality and how we choose to represent it, something I feel isn't talked about enough in the data science community.

“The result is a generalized acceleration of economic activity whose social benefits in terms of productive efficiency are clear enough, but whose social costs ought to be a matter of concern.” (p.122)

I was disappointed that he ended by discussing the political economy implications rather than the deeper social concerns he alludes to here (although such a shift in capital generation was probably exciting and groundbreaking at the time!). It is frustrating to read this now and see that leading experts in the field understood the dangerous social implications of data systems at such an early stage. Earlier in the article, he describes electronically captured actions taking on a "performative quality" when there is an audience, which sounded like a very boring and long-winded way of describing how people use Instagram today. As I read this, I started to worry that we are close to the dystopia he describes, but with a slight twist: in this current version, we hardly need to be surveilled because we are surveilling and capturing ourselves.