DATA-PSST Seminar 2 Position Statement

I was lucky enough to be invited along to take part in the Second DATA-PSST workshop in Sheffield last week, hosted by Vian Bakir of Bangor University. It was a very thought-provoking event and raised many issues around the ethical and technical limits of privacy, which I’m sure will be covered in the report in due course.

A particular issue raised for me is the degree to which human eyes need to be involved in a surveillance practice before it constitutes “Surveillance” – I’d argue that any form of agency exercised in response to that data, whether by a human or an algorithm, has a surveillance element. An argument about mere “processing” of data is problematic, since merely indexing the data for targeted search by a human operator would be processing in itself. Definitions like this are important when it comes to meaningful consent by individuals, and the idea of collective consent to being policed or governed adds a whole extra dimension to what is already a very nuanced problem!

In the meantime, this is the position statement that I submitted, based in part on the arguments outlined in “The Grey Web” paper authored with my colleagues mc schraefel and Natasa Milic-Frayling. Any inaccuracies or outrageous claims in this position paper are entirely my own, though.

Richard

The Web is a Surveillance Tool

Today’s web is funded primarily by advertising: the sub-millisecond delivery of targeted advertising alongside content of genuine interest to users. Networks of content providers, advertising brokers and advertisers allow private companies to record extensive amounts of web browsing history from individual web users. Our research indicates that after visiting only 30 search results there is a 99.5% chance that an individual user has been tracked at least once by each of the top ten third party tracking domains.
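To give a feel for how quickly that coverage builds up (the per-page probability below is my own illustrative assumption, not a figure from the study): if a given tracking domain is present on each page a user visits, independently, with probability p, then the chance of having been tracked at least once after n pages is 1 - (1 - p)^n, and a per-page presence of only around 16% is enough to reach 99.5% over 30 pages.

```typescript
// Illustrative only: assumes each tracker appears independently on each page
// with the same per-page probability. A presence of ~16% per page already
// yields a ~99.5% chance of being tracked at least once over 30 pages.
function probTrackedAtLeastOnce(perPageProb: number, pagesVisited: number): number {
  return 1 - Math.pow(1 - perPageProb, pagesVisited);
}

console.log(probTrackedAtLeastOnce(0.162, 30).toFixed(3)); // ≈ 0.995
```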

These private digital dossiers allow the inference of many pieces of personal information, both in practice (for the purposes of delivering targeted advertisements) and in theory (were the data to be obtained by a fourth party and put to new uses).

Through the research that we are conducting in the Meaningful Consent Project, we observe that even people in the small minority of web users who understand the mechanisms through which third party tracking operates are surprised when we demonstrate the extent of the third party tracking that they are subject to. When asked to suggest information that Facebook holds about them, no participants in any of our focus groups or interviews (n ≈ 35) have mentioned data about their browsing history, which is collected via the “Like” and “Share” widgets that Facebook provides to site operators. This undermines, among other things, the notion that web users give up information as “payment” for service usage – most have never considered the data that is being collected, let alone balanced this against the value of the service that they receive. First party websites and advertisers themselves become complicit in the process of tracking their users and customers, often without a full understanding of the implications or the mechanisms through which the advertising networks operate.
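The mechanism is worth spelling out: embedding a “Like” or “Share” widget makes the visitor’s browser fetch a resource from Facebook’s servers, and that request carries both Facebook’s cookie and a Referer header naming the page being read, whether or not the button is ever clicked. The sketch below is a simplified, hypothetical tracker endpoint (not Facebook’s actual code) showing how little is needed on the receiving end to turn those requests into a browsing history.

```typescript
// Minimal sketch of a third-party widget server logging browsing history.
// The browser's request for the widget carries the site-wide tracking cookie
// and the Referer header naming the embedding page, even without any click.
import * as http from "http";

http.createServer((req, res) => {
  const cookie = req.headers["cookie"] ?? "";          // e.g. "uid=abc123"
  const referer = req.headers["referer"] ?? "unknown"; // the page being read
  console.log(`user [${cookie}] visited ${referer} at ${new Date().toISOString()}`);

  res.writeHead(200, { "Content-Type": "application/javascript" });
  res.end("/* widget code would be served here */");
}).listen(8080);
```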

Unlike state surveillance, which is typically intentional, deliberately engineered and subject to oversight, the development of this private surveillance infrastructure has been driven by commercial ends and has proceeded without any oversight, direction or regulation. Yet the relationship between this organic but pervasive surveillance and the more deliberate, structured surveillance of nation states is unclear. Individual users (particularly those outside the US, to whom most of the USA’s legal privacy safeguards do not apply) are left wondering how porous the relationship between the primarily US-based third party tracking companies and the US secret services really is.

The technology that underpins this third party tracking is often either undetectable – the stateless ‘device fingerprint’ – or functionally ambiguous, by virtue of being the very same technology that supports end-users’ own legitimate aims – the stateful browser cookie that stores your shopping basket. These properties make it virtually impossible to determine the extent of the tracking that a particular user is subject to, and limit the feasibility of technical countermeasures to block it. Given the ubiquity of third party tracking on today’s web, this places a very real limit on the technical feasibility of online privacy.
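To make the contrast concrete, the sketch below shows a toy version of each mechanism: a “fingerprint” derived purely from attributes the browser reports anyway (so nothing is stored on the device and nothing can be cleared), alongside the ordinary stateful cookie that is equally happy holding a shopping basket or a tracking identifier. The specific signals and the hash are illustrative only; real fingerprinting scripts combine far more.

```typescript
// Stateless: an identifier derived from attributes the browser reports anyway.
// Nothing is written to the device, so there is nothing for the user to clear.
function naiveFingerprint(): string {
  const signals = [
    navigator.userAgent,
    navigator.language,
    `${screen.width}x${screen.height}x${screen.colorDepth}`,
    new Date().getTimezoneOffset().toString(),
  ].join("|");
  // Toy hash for illustration only; real scripts use many more signals.
  let hash = 0;
  for (const ch of signals) hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  return hash.toString(16);
}

// Stateful: the same cookie mechanism that remembers a shopping basket can
// just as easily carry a long-lived tracking identifier.
document.cookie = "basket=item42; path=/";
document.cookie = `uid=${naiveFingerprint()}; path=/; max-age=31536000`;
```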

Far from its initial purpose as a tool for academic collaboration, or the grand vision of an egalitarian, pro-human interchange of ideas, the Web that we have today is (at least quantitatively) primarily a surveillance tool.