Surveillance without borders
This time last year, the Computers, Privacy, and Data Protection conference was talking about inevitable technology. Two thousand people from all over the world enclosed in two large but unventilated spaces arguing closely over buffets and snacks for four days! I remember occasional nods toward a shadow out there on the Asian horizon, but it was another few weeks before the cloud of dust indicating the coronavirus's gallop westward toward London became visible to the naked eye. This week marks a year since I've traveled more than ten miles from home.
The virus laughs at what we used to call "inevitable". It also laughs at what we think of as "borders".
The concept of "privacy" was always going to have to expand. Europe's General Data Protection Regulation came into force in May 2018; by CPDP 2019 the conference had already moved on to consider its limitations in a world where privacy invasion was going physical. Since then, Austrian lawyer Max Schrems has poked holes in international data transfers, police and others began rolling out automated facial recognition without the least care for public consent...and emergency measures to contain the public health crisis have overwhelmed hard-won rights.
This year, two themes are emerging. The first is that, as predicted, traditional ideas about consent simply do not work in a world where technology monitors and mediates our physical movements, especially because most citizens don't know to ask what the "legal basis for processing" is when their local bar demands their name and address for contact tracing and claims the would-be drinker has no discretion to refuse. The second is the need for enforcement. This is the main point Schrems has been making through his legal challenges to the Safe Harbor agreement ("Schrems I") and then to its replacement, the EU-US Privacy Shield agreement ("Schrems II"). Schrems is forcing data protection regulators to act even when they don't want to.
In his panel on data portability, Ian Brown pointed out a third problem: access to tools. Even where companies have provided the facility for downloading your data, none provide upload tools, not even archives for academic papers. You can have your data, but you can't use it anywhere. By contrast, he said, open banking is actually working well in the UK. EFF's Christoph Schmon added a fourth: the reality that it's "much easier to monetize hate speech than civil discourse online".
Artist Jonas Staal and lawyer Jan Fermon have an intriguing proposal for containing Facebook: collectivize it. In an unfortunately evidence-free mock trial, witnesses argued that it should be neither nationalized nor privately owned nor broken up, but transformed into a space owned and governed by its 2.5 billion users. Fermon found a legal basis in the right to self-determination, "the basis of all other fundamental rights". In reality, given Facebook's wide-ranging social effects, non-users, too, would have to become part-owners. Lawyers love governing things. Most people won't even read the minutes of a school board meeting.
Schmon favored finding ways to make it harder to monetize polarization, chiefly through moderation. Jennifer Cobbe, in a panel on algorithm-assisted decision-making, suggested stifling some types of innovation. "Government should be concerned with general welfare, public good, human rights, equality, and fairness" and adopt technology only where it supports those values. Transparency is only one part of the answer - and it must apply to all parts of systems such as those controlling whether someone stays in jail or is released on parole, not just the final decision-making bit.
But the world in which these debates are taking place is also changing, and not just because of the coronavirus. In a panel on intelligence agencies and fundamental rights, for example, MEP Sophie in't Veld (NL) pointed out the difficulties of exercising meaningful oversight when talk begins about increasing cross-border cooperation. In her view, the EU pretends "national security" is outside its interests, yet 20 years of legislation offers national security as a justification for bloc-wide action. The result is to leave national authorities to make their own decisions, and "There is little incentive for national authorities to apply safeguards to citizens from other countries." Plus, lacking an EU-wide definition of "national security", member states can claim it as grounds for almost any exemption. "The walls between law enforcement and the intelligence agencies are crumbling."
A day later, Petra Molnar put this a different way: "Immigration management technologies are used as an excuse to infringe on people's rights". Molnar works to highlight the use of refugees and asylum-seekers as experimental subjects for new technologies - drones, AI lie detectors, automated facial recognition; meanwhile the technologies are blurring geographical demarcations, pushing the "border" away from its physical manifestation. Conversely, current UK policy moves the "border" into schools, rental offices, and hospitals by requiring teachers, landlords, and medical personnel to check immigration status.
Edin Omanovic pointed out a contributing factor: "People are concerned about the things they use every day" - like WhatsApp - "but not bulk data interception". Politicians have more to gain from signing off on more powers than from imposing limits - but the narrowness of their definition of "security" means that despite powers, access to technology, and top-class universities, "We've had 100,000 deaths because we were unprepared for the pandemic we knew was coming and possible."
Illustrations: Sophie in't Veld (via Arnfinn Petersen at Wikimedia).
Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.