
PRISM break

The modern usability movement as it applies to computer software and hardware design began in 1988 when Donald Norman published The Design of Everyday Things. Norman, as he's patiently retold many times since, was inspired to write that book by six frustrating months in England, where he was constantly maddened because nothing, not even light switches, worked logically. His most recent book, Living with Complexity, looked at the design of complex systems, trying to pinpoint how to make the services we navigate every day less frustrating.

I was thinking of this on Wednesday, when the Open Rights Group hosted a meeting on the mid-May Sunday Times story that mobile network operator and ISP EE was sharing detailed customer data with the market survey company Ipsos Mori. EE and Ipsos Mori sent representatives, as did the Information Commissioner's Office. Essentially, they said a small pilot project had been misunderstood.

Privacy is a complicated issue because even experts do not have good answers to questions like how big a risk over what period of time is posed by the disclosure of a particular set of data. We know this much: today's "anonymized" data is tomorrow's reidentified data as more and more datasets come online to help triangulate it, much the way today's strong cryptography will be weaker tomorrow as computational power continues to grow. The ability to make accurate assessments is complicated by unknown externalities. How many users remember what they posted under which terms and conditions five years ago? And users themselves have varying understandings of what is actually happening to their data.
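The triangulation risk is easy to sketch. In the spirit of Latanya Sweeney's well-known finding that zip code, birthdate, and sex alone identify most Americans, here is a minimal illustration; the datasets, names, and field names are all invented for the example:

```python
# Hypothetical "anonymized" release: names stripped, but quasi-identifiers kept.
medical = [
    {"zip": "02138", "birthdate": "1945-07-31", "sex": "F", "diagnosis": "hypertension"},
    {"zip": "02139", "birthdate": "1972-03-14", "sex": "M", "diagnosis": "asthma"},
]

# A second, public dataset sharing the same quasi-identifiers, plus names.
voters = [
    {"name": "A. Smith", "zip": "02138", "birthdate": "1945-07-31", "sex": "F"},
    {"name": "B. Jones", "zip": "02139", "birthdate": "1972-03-14", "sex": "M"},
]

def reidentify(anon_rows, public_rows, keys=("zip", "birthdate", "sex")):
    """Join two datasets on their shared quasi-identifiers,
    attaching a name to each 'anonymized' record that matches."""
    index = {tuple(r[k] for k in keys): r["name"] for r in public_rows}
    return [
        {"name": index[tuple(r[k] for k in keys)], **r}
        for r in anon_rows
        if tuple(r[k] for k in keys) in index
    ]

for match in reidentify(medical, voters):
    print(match["name"], "->", match["diagnosis"])
```

Each new public dataset adds more columns to join on, which is why an "anonymized" release can only become easier to reidentify over time.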

We were into privacy policies and user consent on Wednesday when I began to imagine what these might look like under a more stringent data protection law. Will it be like today's omnipresent cookie authorization requests? Click OK to post this data. Click OK to share this data with our partner who just wants to sell you stuff. Click OK to let us reuse this data to personalize the video on the billboard you're about to pass. Click OK to...you mean, you didn't want to send your personal data to the US National Security Agency?

Which is when it occurred to me that we need better mental models of what happens to our data, and we need systems designed to match them. Trying to convey this notion was difficult. Angela Sasse has been saying it to security people for 15 years, and what they hear is that users need awareness training. On Wednesday, the people keen to say they have data privacy under control seemed to hear that users need education and better-written privacy policies - or maybe animations! But, as Norman has often written, a user manual - which is what a privacy policy is - is a design failure. What I meant was that if you could build an accurate picture of users' mental models you could then build systems that work the way users think they do, so that the internal logic on which users base decisions is correct.

I am not suggesting we fix the users. The users aren't broken. Fix the *systems*.

The problem, someone pointed out to me afterwards, is that a lot of people think that their government knows everything about everyone anyway. But there's a big difference between that casual cynicism and seeing proof. Right on cue came the next day's newspaper headlines. The Guardian and the Washington Post say that under a previously unknown program called PRISM the NSA has direct access to the systems of US-based companies: Facebook, Google, Apple, AOL, Skype, PalTalk, and YouTube. (A number of these companies are quoted denying they have given such access.) Direct access as in, walk right in and pick the data they want. Also: the NSA is collecting the phone records of millions of customers of Verizon, one of the biggest US telcos. And: the UK's GCHQ has had access since 2010.

Worse, US politicians are defending it: Democratic senators Harry Reid (Nevada) and Dianne Feinstein (California) in the Wall Street Journal, President Obama in the Guardian. Charles Arthur has a helpful and rational decoding of all this, and Nick Hopkins explains the UK's legal situation with respect to phone records.

At Computers, Privacy, and Data Protection earlier this year, the long-time privacy activist Caspar Bowden discussed the legal and technical framework for surveillance-as-a-service and the risks for EU users of cloud computing (which includes social media sites). Essentially, if there is a back door installed in these systems, "interception" is no longer a useful concept, and encryption is no longer a useful defense. Inside those data centers, data is perforce decrypted, and legally authorized direct access to stored uploaded data under the FISA Amendments Act (since the Fourth Amendment does not protect non-US persons) is not interception of communications.

Before the Internet, it was pretty simple to avoid being surveilled by a foreign country: you just didn't go there. So the first thing we need to make explicit in users' mental models is that uploading photographs and personal data to sites like Google and Facebook is digitally entering the US. Maybe we could start by requiring large pictures of the services' national flag.

Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted irregularly to the net.wars Pinboard - or follow on Twitter.

