net.wars: June 2013 Archives


June 28, 2013

Rounding errors

The key purpose of the long-running Computers, Freedom, and Privacy conference is to look ahead at developing technology and identify future conflicts. This year made earlier years seem startlingly prescient; the angry distrust of government breaking out all over the US since the PRISM revelations mirrors CFP in the mid-1990s, when it was home to the battles for the right to use strong cryptography to protect communications.

Crypto is, of course, also digital cash, which failed then but is now finding some traction in the form of Bitcoin. J. Bradley Jansen asked the Bitcoin panel: is it a currency, a security, or a value transfer mechanism? Patrick Murck, general counsel for the Bitcoin Foundation, had the obviously right answer: "It's an experiment."

You could also call it a "rounding error" in the global economy. Bitcoin: $1.5 billion. Global economy: $50 to $60 trillion. But this term didn't pop up until later, when the topic was drones. This panel showed clearly that drones are at approximately the 1980s stage of personal computers: the ethos of the hobbyist Homebrew Computer Club is clearly echoed by the DC Area Drone User Group.

In the panel I moderated on PRISM, the security consultant Ashkan Soltani pointed out that we in fact have four programs to worry about.

But let's go back to Bitcoin: it is not an answer. Under the current design of centralized exchanges and wallets, transactions are traceable. No matter how you parse the authorities' recent actions, even an experiment is going to attract the regulator's eye when it reaches the size and visibility where ordinary consumers might start dropping money into it expecting it to be safe. And most consumers will want it to.

But, as Murck said, regulators are struggling to understand Bitcoin, which doesn't operate like existing financial services: there's no controller and no one to hold responsible for transactions; it's not anonymous though it is private; and transactions are irreversible.

"Regulators think that's unfriendly to consumers," Murck said on Wednesday. "But it has a growing place in ecommerce." Wryly, he described lobbying as the big Bitcoin investment opportunity. "We need a sane regulatory environment for financial technology in general, not just Bitcoin," he said.

Personally, I suspect that regulators are also confused by Bitcoin's arcane nature. A frequent flyer mile is equally abstract but maps mentally to something familiar. Perhaps Bob Newhart, a former accountant, could take on explaining Bitcoin the way he did Sir Walter Raleigh's discovery of tobacco.

Surprisingly, the situation with respect to drones is little different. Matthew Lippincott commented that in the public imagination a drone carries a camera and a gun. In fact, this isn't my image at all. On the sinister side, I immediately see the crop-dusting planes chasing Cary Grant in Alfred Hitchcock's North by Northwest. More benignly, my mental image is more like the small, light, radio-controlled helicopters that Think Geek sells. (You can tell I've led a sheltered life.) The reality, said Timothy Reuter, is a "loud, flying lawnmower" that has to have its battery swapped out after 20 minutes.

So there was the question: do we look ahead to when drones are tiny bee-like things or even smaller molecular devices that may surveil us from within after we inhale them, or do we regulate for the clumsy, limited devices we have now?

Benjamin Wittes argued that the two topics - regulating drones and privacy - should be separated for regulatory purposes, arguing that Congress would be a better choice of regulator than the Federal Aviation Administration. "There's a pretty big societal change that is the ability of individuals to spy on other individuals in a way that we're really used to associating with governmental power," he said. "I think drones are a rounding error on that problem." A few minutes later, Lippincott agreed: "Surveillance is a rounding error in the problem of universal cameras and camera access that we're facing."

A regulator's lot is not an easy one. If you regulate a rounding-error-sized problem, people think you've lost all sense of proportion and are trying to impede innovation and experimentation. If you wait until the problem is a significant size, you're so far behind the technology that either people mock you or they have deliberately pre-empted any actions you might take.

The point is not who was right when - although the ACLU gets points for its 2006 map of the NSA's spying capabilities. The point is which are the right fights to pick. The crypto wars of the mid-1990s, which Matt Blaze reviewed in his keynote, were a necessary but not sufficient battle to win. We - all of us - lost most of the war to preserve the confidentiality of communications. Partly, the issue is usability: people only use crypto when it's invisibly embedded in the infrastructure; if it's visible to consumers its use is too painful - and it's no defense against traffic analysis. The battle we won was a rounding error on the ones we lost. We need to be smarter.

Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.

June 21, 2013

This does not apply to US citizens

"I don't worry about it," said my friend George at the tennis club when I asked him what he thought about PRISM and NSA spying. "I have nothing to hide."

It seems that George's life has been as spotlessly white as his blue-white English legs when it gets warm enough for him to wear shorts. No secret liaisons, no embarrassing moments he'd rather no one knew, no political affiliations to be nervous about...the only thing he could think of that he'd like kept private is his bank statement. I tried telling him that if you review the law books you'll find that most of us commit several felonies a day, but that didn't impress him either. I'm not sure he's ever even gotten a speeding ticket.

It was only after I said to him, "But what about other people?" that he began to look thoughtful. "Oh, yes," he said. "Other people. We must think of them."

The problem with so much of the US reaction to PRISM is that it focuses on what-plays-with-taxpayers. President Obama reassures the American public by saying that all this surveillance isn't about *them*: "With respect to Internet and emails, this does not apply to US citizens."

Surveilling foreigners, apparently that's OK.

I had not realized until a couple of weeks ago that the attitude that citizenship confers rights that are not available to non-citizens is peculiar to Americans. Of course not: as an American I just assumed it was widely true. This idea was largely reinforced by experiences with the immigration authorities of other countries, who naturally see the right to enter or reside as theirs to confer or withhold. What I didn't realize was that immigration is a special case. *Human* rights are universal; that's the point.

As the privacy activist Caspar Bowden had to explain to me at ORGCon a few weeks ago, the rest of the world is up against the doctrine of American exceptionalism.

It's extraordinary that a nation of immigrants should be so persistent in viewing "foreigners" as lesser beings. And yet it's clearly a thread that runs through much of American life, from debates about whether illegal immigrants should be allowed to get drivers' licenses to the blanket refusal until 1968, when the Supreme Court stepped in, to allow dual citizenship.

I think the us-and-them attitude gets a boost from the quasi-religious nature of how we're taught to be American, something I only really grasped when I took a second nationality. Under the First Amendment, the nation doesn't do school prayer and there is no established religion. Instead, nationalism takes the place of religion as a pervasive unifying bond. At the private school where I grew up, we said the Pledge of Allegiance to kick off school assemblies and sang "My Country, 'Tis of Thee". If you're trying to pull together a nation out of disparate ethnic and national groups, this makes a lot of sense as a bonding exercise. But the downside is to lessen the bonds with people in other countries with whom you might otherwise identify as having common interests and goals. Americans of all stripes are outliers in so many ways: sports (baseball and American football instead of cricket and soccer), religion (American Catholics often seem to have their own version of their faith), and so on.

This attitude is exacerbated by the sheer size of the country. If you are sitting in Nebraska, America stretches as far in every direction as the car can drive.

The big challenge for everyone outside the US, therefore, is to get across to them that what happens to foreigners matters in this story. For one thing, it's not technically possible to implement mass monitoring of the flow of electronic data that only applies to foreigners. Data packets don't carry passports (or, in the metadata equivalent, have a field for "national origin of creator").
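The point about packets can be made concrete. An IPv4 header carries addresses and routing bookkeeping, and nothing else; this toy Python sketch (field layout per RFC 791, checksum left unset, addresses drawn from the reserved documentation ranges) builds a minimal header and lists every field the protocol defines:

```python
import socket
import struct

# Build a minimal 20-byte IPv4 header (RFC 791 layout, no options).
version_ihl = (4 << 4) | 5           # version 4, header length 5 words
header = struct.pack(
    "!BBHHHBBH4s4s",
    version_ihl,
    0,                               # type of service
    20,                              # total length (header only)
    0, 0,                            # identification, flags/fragment offset
    64,                              # time to live
    6,                               # protocol number (TCP)
    0,                               # checksum (left 0 in this sketch)
    socket.inet_aton("192.0.2.1"),   # source address
    socket.inet_aton("198.51.100.7") # destination address
)

# Decode it again and enumerate the fields. The full list is below;
# there is no "nationality of sender" anywhere in the layout.
fields = struct.unpack("!BBHHHBBH4s4s", header)
names = ["version/ihl", "tos", "total_length", "id", "flags/frag",
         "ttl", "protocol", "checksum", "src", "dst"]
for name, value in zip(names, fields):
    print(name, value)
```

Routers forward on the destination address alone; any "is this a US person?" decision has to be bolted on after the fact, from other data.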

It will be some time before we can separate reporting errors and disinformation from the truth about the technical details of PRISM and the other recent revelations. But the documents published by the Guardian yesterday make it clear that determining whether data relates to US persons or not requires the retention and checking of a lot of ancillary information so it can be used to rule the data in or out. Basically, you're talking about a giant database of US citizens. Even then, given the exceptions from the minimization rules for data about US persons, these processes provide little protection to either group. After all, that apparently blameless US person might have been the target of identity theft by one of those...foreigners.

Ultimately, the very fact that the infrastructure for surveillance exists means Americans won't stay safe no matter what guarantees are made now: its use will inexorably spread. Privacy activists call it "function creep". This particular consequence of globalization reminds me of a line from the early 1990s that we used about trying to censor the Internet: that it was like making a rule that you could only pee in the shallow end of the pool.




June 14, 2013

Orphans in a storm

At last week's ORGCon I moderated a panel on orphan works. Specifically, about the recent so-called "Instagram Act", more correctly the Enterprise and Regulatory Reform Act. The panel provided some clarification of this rather contentious bit of new law, courtesy of participants Nick Munn from the UK's Intellectual Property Office; Emily Goodhand, who is both the copyright and compliance officer at the University of Reading and the vice-chair of the Libraries and Archives Copyright Alliance; the independent barrister and Open Rights Group advisory council member Francis Davey; and photographer and security consultant Daniel Cuthbert.

An orphan work may be anything from an unpublished 16th century poem, which under UK law is under perpetual copyright until or unless someone with the legal right to do so authorizes publication, to a photograph taken yesterday, stripped of its identifying metadata, and posted anonymously on Tumblr without a license explicitly permitting reuse. Legally speaking, reusing these orphan works is off-limits. Not that this ever stopped anyone from posting now, worrying later.

The big difficulty is this: there are billions of orphan works, and millions more are being created every day. Dropping the registration and renewal requirements, as the US did in 1976, ended the documentation of copyright owners. Another factor: Facebook and some other social networks deliberately strip photographs of their metadata. Photographers are angry about this, but there are legitimate security reasons to remove the geotagging that might lead the creepy adult to the door of the cute, little blonde child.

Some percentage of those orphans are works whose creators are known but whose owners are still a mystery because the publishers that demanded all-rights contracts have gone bust, merged, had their assets bought and resold, or generally vanished.

There are all sorts of reasons why this situation needs to change. For one thing, libraries and archives can't publish or copy, even for preservation, the centuries-old unpublished works they hold. Hordes of photographs can't legally be copied but, posted online to a Galaxy Zoo-style site, could be identified if enough people looked at them. Finally, an orphan work that no one can access is no good to anyone.

The big fly in this particular slab of frozen amber is photography. Text can be easily searched. Film and music can be compared to other recordings by the same individuals or of the same material. But how easy is it to indisputably distinguish my photograph of the pagoda in Kew Gardens from all the others taken under similar weather and lighting conditions? Professional photographers complain that even commercial news organizations that know better use found images and offer minimal payment only if challenged. Recent years have seen real problems with abusive licensing demands from the big electronic picture agencies, but this is different: it's simple theft. (A big help here should be the October 2012 addition of a small claims court copyright track; the limit is being raised to £10,000.)

To avoid legalizing this kind of theft, the plan, derived from laws in other countries such as Canada and Hungary, is to require a "diligent search" for the owner before an authority grants a license to use the work. To deter people from cheating, there's also some thought of requiring licensing fees up front. The exact definition of "diligent search", how a "market rate" for licenses might be calculated, and who gets to hold the money and use the unclaimed portion are the really contentious elements.

A couple of audience members raised two additional issues. First, whether those pursuing licenses to use orphan works will be allowed to pay to accelerate the process, as is apparently the case in Japan. Second, whether computer programs (and games) will be included. The latter is something the IPO hadn't considered - and it's important, not just because people are nostalgic for the favorite computer games of their childhood, but because both businesses and individuals may have years' worth of work locked up in software that goes out of development.

Museums and archives have pointed out that paying up front for the millions of works in their collections is prohibitively expensive, no matter how small you make the fee, which is mostly going to come out of taxpayers' money. Even at a fixed fee per year, it makes no sense to rob museums to pay collection societies. This problem seems easily fixed by exempting these institutions, perhaps by allowing them to digitize now, pay later. Commercial organizations, which should pay market rate, have a different problem: "market rate" is infinitely flexible and varies considerably over time. A photograph of a ten-year-old staring intently at a drain is worth nothing except to its family - until that kid grows up to be a Nobel Prize-winning physicist.

Of some concern, as Glyn Moody noted, is that the group charged with proposing answers to these questions is almost entirely composed of copyright organizations and entirely lacks representatives of the general public, who are also creators and therefore stakeholders. At the panel, Munn said that it's just a starting point after which there will be a public consultation on the proposed rules. Keep watching.


June 7, 2013

PRISM break

The modern usability movement as it applies to computer software and hardware design began in 1988 when Donald Norman published The Design of Everyday Things. Norman, as he's patiently retold many times since, was inspired to write that book by six frustrating months in England, where he was constantly maddened because nothing, not even light switches, worked logically. His most recent book, Living with Complexity, looked at the design of complex systems, trying to pinpoint how to make the services we navigate every day less frustrating.

I was thinking of this on Wednesday, when the Open Rights Group hosted a meeting on the mid-May Sunday Times story that mobile network operator and ISP EE was sharing detailed customer data with the market survey company Ipsos Mori. EE and Ipsos Mori sent representatives, as did the Information Commissioner's Office. Essentially, they said a small pilot project had been misunderstood.

Privacy is a complicated issue because even experts do not have good answers to questions like how big a risk over what period of time is posed by the disclosure of a particular set of data. We know this much: today's "anonymized" data is tomorrow's reidentified data as more and more datasets come online to help triangulate it, much the way today's strong cryptography will be weaker tomorrow as computational power continues to grow. The ability to make accurate assessments is complicated by unknown externalities. How many users remember what they posted under which terms and conditions five years ago? And users themselves have varying understanding of what they think is happening.
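A toy illustration of how triangulation works, with entirely invented data: an "anonymized" record with the name stripped can be reidentified the moment a second dataset shares a few quasi-identifiers, as Latanya Sweeney famously demonstrated using ZIP code, birth date, and sex against a public voter roll. The sketch below joins two such hypothetical datasets:

```python
# All records below are invented for illustration.
# A "de-identified" medical release: names removed, quasi-identifiers kept.
anonymized_health = [
    {"zip": "02138", "dob": "1945-07-31", "sex": "F", "diagnosis": "hypertension"},
    {"zip": "90210", "dob": "1980-01-01", "sex": "M", "diagnosis": "asthma"},
]
# A public voter roll carrying the same quasi-identifiers, plus names.
voter_roll = [
    {"name": "Alice Example", "zip": "02138", "dob": "1945-07-31", "sex": "F"},
    {"name": "Bob Example",   "zip": "10001", "dob": "1975-03-02", "sex": "M"},
]

QUASI_IDS = ("zip", "dob", "sex")

def reidentify(anon_rows, public_rows):
    """Join the two datasets on quasi-identifiers; a unique match
    attaches a name back to a supposedly anonymous record."""
    index = {}
    for row in public_rows:
        key = tuple(row[q] for q in QUASI_IDS)
        index.setdefault(key, []).append(row["name"])
    matches = []
    for row in anon_rows:
        names = index.get(tuple(row[q] for q in QUASI_IDS), [])
        if len(names) == 1:          # unique match -> reidentified
            matches.append((names[0], row["diagnosis"]))
    return matches

print(reidentify(anonymized_health, voter_roll))
# Alice's "anonymous" diagnosis now has her name attached.
```

Each new dataset that comes online adds rows to the index side of that join, which is why an anonymization judgment that holds today can fail tomorrow.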

We were into privacy policies and user consent on Wednesday when I began to imagine what these might look like under a more stringent data protection law. Will it be like today's omnipresent cookie authorization requests? Click OK to post this data. Click OK to share this data with our partner who just wants to sell you stuff. Click OK to let us reuse this data to personalize the video on the billboard you're about to pass. Click OK to...you mean, you didn't want to send your personal data to the US National Security Agency?

Which is when it occurred to me that we need better mental models of what happens to our data, and systems designed to match them. Trying to convey this notion was difficult. Angela Sasse has been saying it to security people for 15 years, and what they hear is that users need awareness training. On Wednesday, what the group of people trying to say they have data privacy under control seemed to hear is that users need education and better-written privacy policies or maybe animations! But, as Norman has often written, a user manual - which is what a privacy policy is - is a design failure. What I meant was that if you could build an accurate picture of users' mental models you could then build systems that work the way users think they do so that the internal logic on which users base decisions is correct.

I am not suggesting we fix the users. The users aren't broken. Fix the *systems*.

The problem, someone pointed out to me afterwards, is that a lot of people think that their government knows everything about everyone anyway. But there's a big difference between that casual cynicism and seeing proof. Right on cue came the next day's newspaper headlines. The Guardian and the Washington Post say that under a previously unknown program called PRISM the NSA has direct access to the systems of US-based companies: Facebook, Google, Apple, AOL, Skype, PalTalk, and YouTube. (A number of these companies are quoted denying they have given such access.) Direct access as in, walk right in and pick the data they want. Also: the NSA is collecting the phone records of millions of customers of Verizon, one of the biggest US telcos. And: the UK's GCHQ has had access since 2010.

Worse, US government politicians are defending it: Democratic senators Harry Reid (Nevada) and Dianne Feinstein (California) in the Wall Street Journal, President Obama in the Guardian. Charles Arthur has a helpful and rational decoding of all this and Nick Hopkins explains the UK's legal situation with respect to phone records.

At Computers, Privacy, and Data Protection earlier this year, the long-time privacy activist Caspar Bowden discussed the legal and technical framework for surveillance-as-a-service and the risks for EU users of cloud computing (which includes social media sites). Essentially, if there is a back door installed in these systems, "interception" is no longer a useful concept, and encryption is no longer a useful defense. Inside those data centers, data is perforce decrypted, and legally authorized direct access to stored uploaded data under the FISA Amendments Act (since the Fourth Amendment does not protect non-US persons) is not interception of communications.

Before the Internet, it was pretty simple to avoid being surveilled by a foreign country: you just didn't go there. So the first thing we need to make explicit in users' mental models is that uploading photographs and personal data to sites like Google and Facebook is digitally entering the US. Maybe we could start by requiring large pictures of the services' national flag.

