net.wars: February 2014 Archives


February 28, 2014

Body mechanics

Part of the mythology surrounding the former tennis player John McEnroe is that he never practiced. The reality, McEnroe wrote in his autobiography, was that he worked out and practiced much more than anyone acknowledged - and that even if he did take it physically easier than some of his peers he spent an enormous amount of time *thinking* about the game. Mental practice is still practice.

This is an extension of people's desire for athletes to be pure of motive and natural of body. They can work physically beyond all rationality with the world's most expensive experts, but they aren't supposed to gain an "artificial" edge. Remediation, however, is OK. Within the rules, Andre Agassi could take cortisone injections to ease his chronic back pain, but trace amounts of nikethamide got Marin Cilic banned. Maybe it's the "yuck factor": who wants to think of their sporting hero lying on a gurney, sweaty butt pointed upwards awaiting injection with questionable substances in a seedy veterinary clinic?

Last night's Cybersalon on human enhancement carried the remediation versus enhancement theme into hardware for the rest of us, exploring some truly strange possibilities.

It's impossible not to cheer the amputees who, like Veronika Pete and Nigel Ackland, have been fitted with prosthetics with superhuman characteristics. Pete's customized prosthetic leg lights up and has a secret compartment (she's not quite sure yet what she intends to store in it, but do *you* have one?). Ackland has a bionic hand that, he said, has turned him from an object of pity into one of envy. Only a curmudgeon could object to these.

The same is true of two more of last night's presentations, which edged further into creative enhancement. Science writer Frank Swain was only 25 when he realized he was going deaf. The beige hearing aids he received made him wonder: given that the aids do extensive processing to turn the noise they capture into what they think you would normally hear, why couldn't he hack their code to produce something rather more interesting than reality? The BBC radio documentary Hack My Hearing was the result. Swain has big plans. He is, he said, "on the cusp" of being able to hear wifi.

"The idea is not to simply be sensitive to it," he said (or more or less). "It's more to have it like traffic noise and be aware of invisible environment. If you don't perceive those systems they will be able to control you." And, he added, given an auditory connection to your phone, why shouldn't you be able to listen to the stream of information about you and your environment coming over the Internet? I guess it's an auditory version of Google Glass, and while I think augmented reality would be more efficient via audio, keeping your hands and eyes free, the idea makes me long for silence.

Like Swain's hacked hearing aids, Neil Harbisson's implanted periscope/antenna color-sensing device has remediation as its primary purpose: Harbisson was born completely color-blind. Granted, he's come a long way since the first iteration ten years ago, when he began experimenting with transposing the different frequency vibrations characteristic of colors of light into sounds. The current model of the sensor, which works through bone conduction, transmits ultraviolet and infrared as well as the spectrum normally visible to humans. One consequence: he's taken to composing meals and selecting clothing based on how they sing (he was wearing orange, bright blue, and hot pink but failed to provide audio).

These folks are all making the best of a bad situation (YouTube).

At the enhancement extreme, however, lies Rachel Armstrong's work helping - or enabling - Stelarc and Orlan. Armstrong got used to "rewiring" the body to remediate failing functionality while working in a leprosy colony in India. Back in the UK, lacking similar opportunities to "study the cultural, social, and aesthetic effects of being alternatively bodied", she turned to working with performance artists. She spent seven years helping Stelarc develop an extra ear, which he eventually had implanted on the underside of his left forearm. Orlan I remember vividly from a 1994 video presentation of her surgeries at an ICA conference. Some of her fellow speakers turned green; distressed audience chatter filled the unfortunately scheduled lunch immediately following. Armstrong, like Orlan herself, sees in this work a fabulous exploration of the edges of physical transformation. I see something that looks more like mental illness than art.

The two theorists on the panel distinguished, like anti-doping organizations, between remediation and enhancement. Steve Fuller, author of the 2011 book Humanity 2.0, noted that the problem with enhancement arrives in stages. First it's rare and confers an advantage; then it's normal and society reorganizes around it; finally, those without are left behind. By contrast, Dave King, founder of Luddites 200, proved to be the lone dissenter. He is, he said, not anti-technology but "anti-technocracy". Enhancement, he argued, is hurtful to community because it values competition to be the best.

Not being a theorist, I see a practical reason to be cautious: defying biology is a risky business. I'll stick to original equipment as long as I can - and keep the enhancements external, thanks.

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series. Stories about the border wars between cyberspace and real life are posted throughout the week at the net.wars Pinboard - or follow on Twitter.


February 21, 2014

A problem shared

In a surprise turnaround, this week NHS England announced that the giant care.data database will be delayed for six months while everyone rethinks. Private Eye, in analyzing why people are so upset about what it calls an "eminently sensible" plan to use patient data to improve the quality of medical care generally and the efficacy and efficiency of the NHS specifically, hit pretty much all the right nails on the head. The government opted us all in by default, assumed that we all trusted it to manage a giant, high-value database without trouble, and offered little granularity of choice or individual control. Essentially it planned to write our collective signatures on a blank check (or cheque). Small wonder there were objections, quite apart from the fact that a lot of people didn't get the leaflet propagandizing the system.

The objections are handily explained by Ross Anderson (see also here), who includes a handy link to the MedConfidential pages explaining how to opt out. To round things out, the BBC reports on Patients4Data's attack on critics, and ComputerWeekly says the delay was forced by the Information Commissioner and privacy law.

The money quote from Private Eye: "It would have been more sensible to give patients easy access to their own data first so they can decide whether they want to pass it on." Yes. That.

Coincidentally, this week, on a panel I appeared on, Sarah Lawson described the instructions she gives fellow staff at the University of Oxford, where she is the security wrangler for a group doing clinical trials: "Imagine each piece of data as a picture of yourself naked." Yep, I can see exactly how that would work. It's a brilliantly constructed mental model to create a culture of careful data practices, and the government should take note.

It's clear that many people do want to share their data. It's also clear that many people don't. Some support for this turned up at last night's meeting of Quantified Self, where the ODI's Ulrich Atz presented the first results of a survey of self-hackers' practices and goals. Ninety percent of the 105 respondents were willing to consider sharing their data in a good cause, although at least one imposed a notable limitation: "Yes - after my death."

There are a lot of reasons why people might want to share their medical data. Anyone with a severely ill child, for example, would share all the medical details if they thought there was even the slightest chance doing so would produce a second's less suffering or a marginally better outcome. One of the earliest cases of this I'm aware of was in the early 1990s on the WELL, where member Phil Catalfo posted everything he could about his son's leukemia; other members with medical backgrounds helped the family understand what the doctors were telling them. There were other such cases in those early days; I seem to recall a Chinese cellist whose ailment was diagnosed by far-off doctors reading the publicly posted records, for example.

In these cases, though, the benefit to the patient was clear, and the patient (or his trusted representative) made the choice to share. Similarly, I think most people would agree with Ben Goldacre that patient data on a national scale provides an extraordinary opportunity to improve medical treatments for all. One of Goldacre's three points of advice for rescuing care.data is to show people the good that exploiting this data can do.

But Goldacre stops there: his other two points are that the government should come back with a clearer story and a set of unambiguous rules and principles for how the data will be shared (and to stop lying about the data being anonymized and therefore untraceable to individual patients), and that the penalties for breaking these rules should be draconian. Switching to an opt-in regime would, he says, destroy the data, as would too many people opting out.

I don't think that's going to work, and for a simple reason: governments change. Data protection law is supposed to block data collected for one purpose from being used for another - but data protection law only applies when governments haven't passed laws saying it doesn't. Even if this government comes back in six months, suitably chastened, and lays out an acceptable program of data sharing and waterboarding for violators, there is nothing to stop the government of five years from now from changing the law to vastly expand the program. Once they have the data, we have lost control. No matter how polite they get about it, they are asking for that signature on a blank check. And medical data, unlike most other sorts, is uniquely revealing not only about ourselves but about our relatives.

And that's why Private Eye's view is the right one. I have no idea what the data my GP has about me looks like or what would be uploaded. Does it include notes? Does it include the questions I've asked? Are there errors? Is there information that invades the medical privacy of my siblings, who do not live in the UK? Show me the data and who you want to give or sell it to. Then we'll talk.





February 14, 2014

Slip-sliding away

Yesterday's UK Internet Governance Forum meeting was a masterful demonstration of why people outside government, especially techies, get terminally frustrated with the processes of governance. It was meandering and slow, with no particular goal in sight other than to make it to future meetings, where presumably the same issues would be discussed again. There are many of these later meetings, and they all sound important. The Internet Corporation for Assigned Names and Numbers, which actually does govern, or at least run, the domain name system and allocates the numbered Internet protocol addresses, will hold its June meeting in London. There will be a summit in Brazil in April, an ITU meeting in October, all leading up eventually to the full Internet Governance Forum in Istanbul in September.

That's a lot of wrangling over a question that certainly matters - how should the Internet be governed, by whom, and with what accountability? - but that is happening in a post-Snowden technical vacuum. The congregations that will determine how the Internet functions technically are happening elsewhere, such as the March Internet Engineering Task Force meeting, also in London, where engineers and security experts will meet to try to figure out how to harden the Internet against endemic multi-lateral spying. Other than a few BCS representatives discussing surveillance, the technical community was not represented. The need for their presence was noted by the Foreign and Commonwealth Office director of international cyber policy, Jamie Saunders, who dubbed the organizations that do the hard technical work - IETF, IAB, ISOC, and so on - the "i-stars".

For the rest...as Nigel Hickson, now ICANN's vice-president for Europe and a veteran of the early 1990s crypto wars, commented, the drive to make ICANN less cosily attached to the US Department of Commerce long pre-dates Snowden. So does the dispute between countries that support a multi-stakeholder model and those that want a top-down ITU-style governance model (I trace this discussion to at least 1997). Snowden or no Snowden, he said, it would have resurfaced this year as part of the WSIS+10 review and other upcoming ITU events. Snowden's revelations have strengthened the existing feeling that the US cannot go on being the most special snowflake in terms of Internet governance. As much as many Americans seem to resent that (reader email has frequently said "We invented it, why should we give it away?"), I agree with Hickson that widening ICANN's accountability was always inevitable and even essential.

The problem is that while this group is fiddling - "Why do we need Internet principles?" asked the moderator seriously to start off a session on what those principles should be - the chance that what they decide will actually matter is slipping away. Some examples, from recent weeks' headlines.

Item: In the US, Comcast is buying Time-Warner cable. Comcast claims that this is really not a bad thing because it has all this competition - Google Fiber, Hulu, Netflix. Consumerist has a nice round-up of the objections. The problem is not just that the merger creates a nationwide near-monopoly on cable provision (after all, the UK's Virgin Media is in a similar position) but that in most areas there is only one other broadband provider (where in most of the UK there are five major ones, and often hundreds of smaller ones). This doubly matters because...

Item: In 20 US states ISPs and cable companies have successfully campaigned to get restrictions on municipal wifi on the books. This seems to me no different than General Motors and Chevron teaming up in the 1930s to buy up and shut down more than 200 municipal streetcar systems in order to force a reluctant public to buy cars. This is where a giant Comcast is a particular threat.

Item: AT&T's "bright idea" of sponsored data is spreading to the UK, where Techdirt reports that Vodafone wants to try it too. As previously noted, this is a spectacularly bad idea that allows yesterday's innovators, who took advantage of equal access to build big, established companies, to pull up the ladder behind them and lock out newcomers.

Item: Yahoo! Germany has published new terms and conditions that refer German users to Irish data protection law in case of disputes. Software and services companies are long accustomed to dictating the jurisdiction that will apply (usually California). So Germans are supposed to learn English to complain about how their data is being handled now?

Item: Nominet, the UK domain name registry, which sponsored this IGF meeting, is itself preparing to favor businesses over charities as it brings in second-level .uk registrations. As Simon Phipps writes, this is simply not fair. It's certainly not how we want wider Internet governance issues to be decided. That's why we have Internet principles: so we have a way of gauging whether the structures we have and are building are having the impact we intended. Two of those principles are fairness and equal access.

All of these moves tend to concentrate power over access to and/or content on the Internet. Yet that was the big subject that never came up yesterday: the concentration of power in the hands of a small number of private-sector players and the need to decentralize. By the end of all these meetings, there may be a governance structure - and nothing left to govern.




February 7, 2014

Original sin

Is there a prayer that says, "Oh, Lord, protect us from giddy optimists?" If not, maybe there should be.

Granted, optimists, as the veteran screenwriter Earl Pomerantz wrote recently, are responsible for much of the progress in the world. (Pessimists are too busy coming up with the reasons why it - whatever it is - will not work.) But *giddy* optimism is the kind of thing that gets people imagining that if we can get this bit of new technology to work it will be a great force for democracy, fairness, and social justice.

Last week, as one of the Open Data Institute's Friday lunchtime series, Alan Patrick poured a few helpful dark side thoughts onto open data.

Patrick has company. In Seeing like a geek, Tom Slee writes that open data enables the replacement of small gatekeepers with fewer, larger ones, devalues "informal knowledge", and widens existing inequalities by empowering the already empowered. Both Patrick and Slee cite the same study showing that They Work For You, the much-praised MySociety site, is disproportionately used by men, the college-educated, and the over-54s - the very people who find it easiest to contact their MPs in other ways. O'Reilly Radar has a rebuttal by Mike Loukides, which argues that while open data is not an unalloyed good, private data is a public bad. The problem, as I wrote last year in response to a talk given by Bill Thompson, is that there's no way to open data only to the good guys with pure motives without making it closed.

The good thing about these nay-sayers is that they're speaking up so early in the history of open data. Patrick himself compared the current state of open data to the early days of the Internet itself. At the beginning, he said, there was an assumption that the "bad guys" would always be on the outside and that the Internet would only be used for good. Patrick has a sufficiently long history at places like BT and the BBC to be able to lay reasonable claim to remember the Internet's origins, but I'm still not sure he's right. When the pioneers talk, such as at last summer's Internet history event, the stories they tell are about just trying to get stuff to work.

Patrick is right, however, that at the moment the most common complaint about open data is that there isn't enough of it opening up fast enough. He cited, among other things, a 2012 study from the University at Albany, NY (PDF) that advised taking time to think about consequences and sustainability. You can see their point about the latter: at CPDP a few weeks ago, Meg Ambrose outlined her study showing that only 10 to 15 percent of Web content lasts online as long as a year. We call that vanishing content "link rot" or "bit rot"; DuckDuckGo founder Gabriel Weinberg has talked about "API rot"; next will be "dataset rot".

Consequences are of necessity harder to guess at. One of the points Patrick cited from the Albany study was the need to understand the practices that created a given dataset, especially since most were not created with public reuse in mind. Worst, of course, is the case he cited from the UK, in which the Secretary of State for Health, Jeremy Hunt, conflated open data with personal data in proposing to offer patient data to private companies for research. We've written enough here about the privacy aspects of this sort of thing; Patrick's larger point was that this textbook case of how not to approach open data is asymmetric. That is, the people bearing the risks, who will suffer any damage it causes, are not the beneficiaries. "The benefits will be private, the losses public," Patrick said.

And won't hackers have fun matching black market data with the new open stuff? Here Patrick made a point I've long thought about: "This is a read/write game." In other words, what we have to fear is not just that criminals will be enabled to mount far more sophisticated spear phishing attacks but that they can literally poison the information supply. In one case Patrick cited - crime reporting - people in some areas stopped reporting crimes for fear that the open data would sink their property values.

When Patrick started talking about collateral damage, I began wondering about an analogy Michael Froomkin was mulling over a year or two back between privacy and environmental protection. As in the case of pollution, any damage caused by open data is likely to be cumulative and take place far downstream, possibly many years in the future. There's no obvious way to plan for this, any more than the guy making the first refrigerator could have foreseen the hole in the ozone layer over Antarctica.

Patrick is certainly right that the time to think about these issues is now, at the beginning. But there's a downside to the downside: there is a lot of data we really do want opened. Sometimes the most important thing at the beginning is to make a start before the pessimists talk you out of the whole thing.
