Only forget
The so-called "right to be forgotten" (RTBF), proposed as part of the EU's data protection reform package, is a particularly contentious idea. Sitting at the nexus where privacy, freedom of expression, and freedom of association collide, it impinges in every direction on other rights we care about.
At a Westminster eForum on the data protection reforms (PDF) a couple of weeks ago, David Smith, deputy commissioner at Britain's Information Commissioner's Office, commented that he thinks RTBF "may have unintended serious impact by setting users' expectations too high for what can realistically be achieved". Today, media stories are reporting that the UK wants to opt out, for just those reasons. As the Telegraph explains matters, individuals currently have the right to object to wrong information; what's proposed is to extend that right and put the onus on companies to comply. At stake for non-compliance (with the various new rules, not just RTBF): up to 2 percent of a company's global income.
Smith thought it would be better to characterize it as a "right to object", and that, seen in that light, it could be a good thing: "I can say I object and ask you to stop, and you will have to stop unless you can come up with valid reasons." Such a right would, he said, reverse the burden of proof in existing processes, rather like shifting from opt-out to opt-in. This seems reasonable to me, in part because no matter what the law says it's impossible to be sure that every copy of anything has been removed from the Internet.
Framed Smith's way, the proposals seem more rational: there is real harm at stake here. For example, why should someone applying for a job be at a disadvantage because the first few pages that turn up in a Google search of their name are full of abusive postings over something they once said about Harry Potter? Whether or not they did anything wrong, today's cover-your-corporate-ass attitudes mean the employer will move on to a less controversial candidate. It can take years to push that kind of thing down enough pages that Human Resources will lose patience before finding it. Similarly, with no obligation to check, update, or correct information, sites are free to spread as much misinformation as they like. These are issues that have been well covered by privacy scholars such as Daniel Solove and, in his 2010 book, Delete, Viktor Mayer-Schönberger.
Anyone with a modicum of technical knowledge about the Internet will say it's not reasonable to expect ISPs, search engines, hosting sites, and social networks to police the steady flood of data that's being posted. Granted. Much of the impetus behind RTBF came from the discovery, largely due to research by Cambridge's Joseph Bonneau, that deleting material from social networks is often purely cosmetic: the material is no longer served up to the public but continues to reside on the networks' servers. Bonneau published his research in 2009; it was August 2012 before Facebook changed the system so that deleted means deleted. The hunger of (especially) advertising-supported companies for Big Data means there is a genuine need for this intuitively correct right to be backed by law. The alternative risks rampant deception, revealed only occasionally when researchers like Bonneau are motivated to dig and publish.
Other difficulties concern the nature of social networks. If a circle of friends regularly post pictures of themselves and each other, large holes are torn in the social fabric when one of them demands that all material relating to them be deleted. What are the boundaries of RTBF at a time when people, like networks, are losing their defined perimeters?
Naturally, the biggest push-back against RTBF is coming from the US, home of the biggest advertising-driven companies. In the Stanford Law Review, Jeffrey Rosen analyzes the legal roots of this cultural disjuncture, tracing it to variations in the way past convictions are handled in criminal law. In the US, convictions remain on the public record, while in Europe, after a certain amount of time has passed, they become "spent" - essentially, forgotten. In 2009, Wikipedia was the center of exactly this cultural clash.
Like Lauren Weinstein, who sees RTBF as a modern reenactment of the memory hole in Orwell's 1984, Rosen concludes that RTBF necessarily means chilling freedom of expression and implementing mass censorship. Peter Fleischer, Google's global privacy counsel, doesn't love it either: by January 2012 he was complaining that, their expectations raised, individuals were asking for links to legally posted information about them to be deleted from search listings, perhaps a reference to the workings of Spain's data protection law. Fleischer and the EU went on to argue about whether search engines should be responsible for such deletions. Last week, a US coalition launched, claiming it wants to find a balance between protecting privacy and ensuring the free flow of information.
The labyrinthine process of legislation in the EU makes it hard to parse the thousands of amendments and myriad committee votes to get a sense of how this particular debate will translate into law. You would hope that the length and complexity of the process would result in a nuanced outcome.
Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series. Stories about the border wars between cyberspace and real life are posted throughout the week at the net.wars Pinboard - or follow on Twitter.