It strikes me that the Oculus Rift is an apt metaphor for the rest of Facebook's business (as well as that of other large data-driven companies). I understand that the experience is meant to be immersive, but I hadn't appreciated how thoroughly the headset cuts you off from the world around you. In a demonstration mounted for attendees at last Tuesday's Privacy@Scale, only the demonstrator's voice and guiding hand penetrated from the real world. In eventual gaming and entertainment deployments, presumably users won't have even that. In that, it differs little from the rest of Facebook's business, which would prefer us to stay within its enclave, generating data and viewing ads.
Tuesday's event was a general discussion of current issues in technology and privacy. In a conversation about communicating with users, one of the company's representatives said matter-of-factly that the company wants the ads it shows to be as valuable to users as the content. "We don't want ads wasting people's time."
It was one of those surreal moments. On the one hand, her statement was perfectly rational: of course they want ads people value - ads that result in clickthroughs and sales generate much greater revenue. On the other...does she inhabit the same world as the rest of us? Did she really mean to say that correctly targeted ads are as meaningful to us as cat videos or pictures of new grandchildren? It's like those airport officials who talk enthusiastically about making their airports destinations in and of themselves. You can have just as much fun at Heathrow as in Hawaii!
I don't want to pick specifically on either her or Facebook - which is why I haven't named her. I recount the incident as an example of why it's so hard to get through to those on the other side of the tracking-and-advertising divide. The rise of ad blockers, which concerns so many, is a reflection of that divide - and although most people have multiple motives for using blockers, the escalating size of the installed base tells us that when you give people tools they can actually use, they will use them.
The ads-as-engaging-as-content idea of course requires - and for a business justifies - an enormous amount of data collection. This gap is equally hard to bridge. What we want: fair data practices. What they hear: more accessible corporate policies.
It's been obvious for a long time - I wrote about it for the Guardian in 2008 - that privacy policies aren't written to aid consumers but to cover corporate asses and assert control. The same is even more true of terms and conditions, which are spiraling out of control.
The week before last, the Norwegian Consumer Council staged a live-streamed reading of all of the terms and conditions that apply to an average smartphone, which their research showed contains 33 apps. The reading took a shade under 32 hours.
Their point, of course, was to highlight the complete unreasonableness of expecting consumers to read all that. Research at Carnegie Mellon has estimated that reading the privacy policies applying to all the software and services consumers use would cost each of us 76 days a year.
Statements of corporate policy do have their uses, as the lead of that research, Lorrie Cranor, pointed out on Tuesday. Even if consumers don't read them, journalists, activists, lawyers, and regulators do; therefore, these documents provide a mechanism for holding companies to account. Creating standards for privacy policies that would make them machine-readable, a strategy Cranor favors, would help make it possible to comparison-shop. The FTC's standardized model contract for financial privacy, for example, enabled Cranor's lab to build a demonstration search service for US banks.
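To make the comparison-shopping idea concrete: once policies are expressed in a standard, machine-readable form, filtering providers against a user's requirements becomes a trivial query. The sketch below is purely illustrative - the field names, values, and bank names are invented for this example, not drawn from any real standard or from Cranor's actual search service:

```python
# Hypothetical machine-readable privacy policies. All fields and
# provider names here are invented for illustration only.
POLICIES = {
    "Bank A": {"sells_data": False, "retention_days": 365, "opt_out": True},
    "Bank B": {"sells_data": True, "retention_days": 1825, "opt_out": False},
    "Bank C": {"sells_data": False, "retention_days": 90, "opt_out": True},
}

def acceptable(policy, max_retention_days=400):
    """Return True if a policy meets this user's minimum requirements:
    no data sales, an opt-out offered, and bounded retention."""
    return (not policy["sells_data"]
            and policy["opt_out"]
            and policy["retention_days"] <= max_retention_days)

# A comparison-shopping service reduces to a filter over the corpus.
matches = sorted(name for name, p in POLICIES.items() if acceptable(p))
print(matches)  # ['Bank A', 'Bank C']
```

The point of standardization is exactly that this query is boring: the hard work is agreeing on the vocabulary, after which comparison tools can be built by anyone.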
At Civicist, mySociety founder Tom Steinberg has a post, The rise of the Internet mitigators, in which he argues that promoters - a category in which he puts those creating technologies and services - will be overshadowed in the coming years by mitigators seeking to regulate in the interests of undoing damage. I'm less sure that's true. On Tuesday, Michelle de Mooy, deputy director of the privacy and data project at the Center for Democracy and Technology, suggested that CDT needs to change toward being innovators. "Advocates have to get ahead of the issues," she said. Organizations such as Mydex (ObDisclosure: I do some writing for them) are trying to effect real change by providing the technology infrastructure to up-end the power imbalance inherent in today's data practices. The future may, in other words, be neither mitigators nor promoters but a blend of the two.
Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.