

This has been quite a week for privacy stories. The phone hacking trial finally concluded, imprisoning former News International editor and Downing Street spinmeister Andy Coulson and acquitting Rebekah Brooks. A little of the shine may have been taken off the acquittal, though, by the various comments about the incompetence of an editor who knew so little about what was going on in her own newsroom. Tom Watson, MP (Labour, West Bromwich East), one of the key reasons the trial took place at all, said it best when he commented that Brooks "wasn't the editor I thought she was". Newer this week were Google's first fumblings at implementing the European Court of Justice decision in Google v. Spain and Facebook's incursion into oh-look-we've-got-a-billion-lab-rats.

To take Google first, it was predictable that the first attempts to implement the court's decision would be contentious and that there would be errors. And so it's proved: stories are proliferating about news articles being removed from the index after complaints. It would be wise to be careful about rushing to judgment. The media have seized upon complaints; in the Guardian, James Ball urges them to fight back, vastly overstating the removal of articles from Google's index as a "memory hole". Get a grip: the articles are still accessible at the original publisher's site. What's different is that they do not appear in response to searches on a particular person's name.

Everyone is assuming that the name in question must be that of the most famous/rich/powerful person in the article, but in at least some cases - such as Robert Peston's column, Merrill's Mess - the person requesting removal was in fact an ordinary member of the public wishing for his own appearance in the comments to be banished. The original story is exactly the kind of public-interest journalism the court had in mind in trying to balance privacy and freedom of expression - both fundamental rights. Facts are a fine thing.

I think it's logical that Google, trying to make its first decisions about the 70,000 requests it says it's received since the judgment, might make errors. That doesn't obviate the need either for Google to implement a judicial decision it vehemently opposed or for all of us to find a suitable balance between these rights. The biggest danger is really that civil society will be split down the middle between privacy and freedom of expression, rendering both sides ineffectual.

But, oh, Facebook! For a week in early 2012 the service manipulated 700,000 users' newsfeeds to find out whether the positive or negative items they saw affected their emotions as displayed in their own status updates. We know this because the authors, who include Facebook data scientist Adam Kramer, published the results: a small "emotional contagion" effect in both positive and negative cases. Facebook's interest in emotional contagion makes business sense: it wanted to know whether seeing a lot of unhappiness would make people less inclined to visit Facebook (as if the interface weren't sufficient). Facebook has argued that its terms of service gave it permission; unfortunately, as Forbes reports, the relevant clause was added only *after* the data had already been collected.

Two of the most interesting comments on the resulting controversy come from Ed Felten. In one posting he argues that the study violates users' privacy by creating knowledge about them that they themselves did not have. In the other he calls the contrasting responses to the news a culture clash between the differing standards for academic and corporate research. Felten believes - rightly, I think - that we will see many more similar clashes, partly because so much academic research is now funded by corporations seeking answers in their particular areas of interest. This is a fine example of Eli Pariser's filter bubble, in which you can never be sure how what you see on the Internet has been manipulated.

This is what the memory hole really looks like.

As for Facebook's handling of the incident...thanks to the detailed education afforded to me by the brilliant SorryWatch, Sheryl Sandberg's apology on behalf of Facebook is easy to parse.

From PR Week:

She said: "This was part of ongoing research companies do to test different products, and that was what it was; it was poorly communicated. And for that communication we apologise. We never meant to upset you."

If we assume that Sandberg was speaking in the normal fashion and using "you" to mean the people to whom she was directly speaking at the time, then the people she was apologizing to were the small businesses in New Delhi. And what she was apologizing to them for was not Facebook's use of them personally, nor of the rest of its billion-odd friends, as research subjects. No, she was apologizing for the way the news came out. Down this path logically lies continuing to perform and commission such research - but not allowing it to be published - as opposed to what I imagine most Facebook users want, which is not to be manipulated for prospective profit. I have no idea whether the small businesses of New Delhi were upset by the journal publication or reassured by the apology.

Note also the use of "companies", as in, "All the kids do this. Why are you picking on me?"

It is to ARGH.

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.

