net.wars: June 2014 Archives


June 27, 2014

There is no Theresa May

"There is no programme of mass surveillance and there is no surveillance state," Home Secretary Theresa May said in a speech this week. She went on to propose the revival of her pet project, the Communications Data Bill, aka the Snooper's Charter. This is the thing where everyone's communications data is stored in a giant shed where it can be searched at will. Seems just yesterday we were declaring it dead.

This in the same week as the Privacy International-inspired revelation that GCHQ classes anything sent to Facebook, Gmail, and the rest as external communications, which makes them fair game for searching under the Regulation of Investigatory Powers Act as long as they don't use British names or residences as the search terms.

May's statement reminds me of Archimedes, Merlyn's owl in T.H. White's The Once and Future King. Hoping to be ignored, upon being introduced to Wart (the boy form of King Arthur), Archimedes closed his eyes and said doubtfully, "There is no owl." Followed almost immediately afterwards by the hopefulness of "There is no boy".

It's tempting for opponents of stuff like NSA and GCHQ spying, data retention, warrantless searches, and wholescale monitoring of data sent to FaGooTwitAp to claim that May is engaging in Orwellian doublethink. But - leaving aside for this week the problem that Orwell's 1984 is becoming dated as a metaphor for our time - I think not.

Do we have a surveillance state? Your perception of this probably depends somewhat on where you live and your socioeconomic class. The reality, however, for most of us is that it's all rather abstract even though we *know* - because of the Snowden revelations, the cameras visible everywhere, the data breaches, the creepily targeted ads, and the news stories about care.data and new technologies like Google Glass - that the potential is there. People on my street do not cower in the dark filled with fear of saying or doing anything that might call attention to themselves. And, judging by the wildly inappropriate halves of phone conversations you hear out and about and the more extreme ranges of behavior on Twitter and other social networks, neither do most others.

So I'm going to have to say that May is not, like Archimedes, just trying to wish away a situation she dislikes. She is technically correct: we do not have a surveillance state. What we have, and what she would like to continue to build, is the *apparatus* to support a surveillance state. Similarly, she may be technically correct to say we do not have a program of mass surveillance: what we have is many programs that are building capabilities that taken together could underpin such a program. You may call it a distinction without a difference, but you could say the same about trying to decide whether May is ignorant, in denial, or disingenuous. In both cases, in the long run, it doesn't matter, because if the underpinnings are there the switch can be flipped by a Home Secretary willing to do so. And what May's statement is very clear about is that she wants those underpinnings.

In May's insistence on all-data, all-the-time, it's little surprise that the UK is out of step with other parts of Europe, which are beginning to react as they legally must to the European Court of Justice ruling that data retention is disproportionate. Austria has outright dropped it; some Swedish Internet service providers have simply stopped. Doubtless there will be a redrafted directive in due course. (Which raises an interesting question: where would an independent Scotland stand on this issue? Could state surveillance become an issue in the September referendum?)

It's a bigger surprise that even the US might reverse itself, if only slightly, before the UK does. Last week the Supreme Court ruled that police must get a warrant to search mobile phones. My favorite piece of that judgment is Chief Justice Roberts' recognition that once the police have ensured a mobile phone can't be used as a physical weapon, inspecting its stored data does nothing further to ensure their safety.

In an even bigger surprise, this week US Attorney General Eric Holder promised to grant Europeans privacy protections similar to those afforded to US citizens by the Privacy Act. Granted, we await actual legislation to see if and how the Obama administration follows through - without changes to FISA it's hard to see how this is going to work. But the fact that they're even using these words is a major breakthrough. A cynic would suspect that the fact that Holder specifically mentioned the EU suggests that the actual goal is more related to trying to limit the impact of data protection reform on US data-driven companies than to abandoning American exceptionalism.

Unlike Archimedes, therefore, May is, if anything, trying to bring something into existence rather than deny it. I think the logic goes: there is no surveillance state - so we *can* implement all this pervasive data monitoring - and therefore we should, because if we don't terrible things will happen. I guess we were supposed to find it comforting. Me, not so much.

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.

June 20, 2014

Twin cities

"We will make that person job-ready," Mary Keeling, IBM's manager for economic analysis and smarter cities, promised at last week's Westminster eForum seminar on smart cities, speaking of what could be done with the right big data analytics. I might - only might - have let that pass without notice if I hadn't been, not long ago, at the Cybersalon event on the same topic. There, Usman Haque pointed out the work ethic-oriented rhetoric surrounding IBM's marketing of its "smarter planet". I've heard those ads for years, but noticed the music more than that underlying message.

Haque went on to comment that the technology companies selling smart cities tend to see them as a problem to be solved.

"What makes a city valuable is the unpredictability, diversity, and heterogeneity, all the stuff you don't expect and the people you disagree with," he said. "But that's not what the technology companies are selling you. They are providing systems for convenience, predictability, optimization, security. And that's how they're selling the technology." (As Don Draper might have said, "What's the happiness of solving a problem? The moment before you need more happiness and a new solution.") Haque again: "The same language was used in the 1950s and 1960s to sell highways and high-rises - ways that cities will become more efficient. Now we understand that highways have had untold consequences...and high-rises have had to be knocked down."

Frank Kresin, research director for Amsterdam's Waag Society, similarly said, "Smart cities are not about technology but should be used to empower people. What is smart about a city is the smart people who live there."

Hey, that's *us*!!

So I was newly sensitized by those comments when I heard "job-ready", which, no matter how much the person concerned wants work, doesn't sound like something you do to a person - it sounds like something you do to a machine tool. If smart cities are to be the vibrant, attractive places that proponents suggest, they will have to be more human than that. Which is what Haque was arguing for: the Engaged City. The city where people are enabled to collaborate on shared challenges rather than having their planner-specified problems "solved" by remote black-box algorithms and technologies. The comic version of Haque's concern might be the transport planner in Twenty Twelve, who was convinced he could unsnarl London's traffic if his policies were just applied to a large enough area. The serious version is the project Haque cited which aimed to replace traffic lights with interactive signs on the road. Do you want people staring at the road or looking up and around them for oncoming traffic?

Anyone who's been around information technology systems for any length of time knows that their success depends on their being embraced by their users - and that users will only embrace them if they have been engaged in the development process and the systems offer them genuine benefits. To take Oyster cards as an example: people did not care whether they had a smart card or a paper ticket as long as they could get on a bus or tube quickly. London Underground added a bit of encouragement by making Oyster fares cheaper than the cash equivalent; and people quickly found it convenient not to have to buy a new ticket for every journey. In this time of steady reports of data breaches, the incremental change of dropping Oyster cards in favor of paying directly with bank cards may not be as widely accepted among locals - but will likely be loved by visitors.

Technology systems sold by vendors to government agencies and departments typically see the staff of those agencies and departments as the users of the systems they sell. To some extent that's correct: they are the people who have to run the software every day and live with its bugs and quirks. So they are an important group to consider. However, the *actual* users of these systems are all of us: the people who must tangle with DWP to get state benefits and pensions; the people whose children are experimented upon with each new education policy; the residents who do not get asked which hours they need parking to be restricted.

The effect of this way of thinking will be seen especially in its impact on individual privacy, which was not discussed at the Westminster eForum at all except in the context of compliance with data protection law. The forerunners of tomorrow's smart cities are systems like Oyster and like today's network of ANPR cameras, which began with London's congestion charge, all of which could have been designed with privacy and anonymity in mind but which in general are instead maximally invasive. In this context, it's worth recalling that many measurement systems of the past - IQ testing, for example - were intended to identify people who needed help but have morphed into systems used to rank everyone - a purpose for which these metrics were never designed. In today's accelerated world, technologies and companies come and go very quickly, and jobs appear and disappear again in response. Big data analytics are only as good as the data you feed them; today's "job-ready" person may be tomorrow's between-stairs maid.


June 14, 2014

Acts of succession

The drones were pretty against the dimming summer sky. No, not that kind of drones. These belonged harmlessly to members of the DC Area Drone User Group, although the fly-up's organizers warned that some had cameras.

This is the 2014 edition of Computers, Freedom, and Privacy. At this event last year, the same group said these drones were no threat to privacy because they were like "flying lawnmowers". The loud buzzing noise kind of made their point; these drones' biggest threat is probably that they'll inspire reports of UFO sightings. I'm sure the DC Area Drone User Group are very nice, ethical guys; it's how their toys will be exploited later that I worry about.

Shortly afterwards, the former governor of Pennsylvania and first Secretary of Homeland Security, Tom Ridge, pitched his right to be a private person now that he's left public service. Isn't this what we all want? To be private when we want, public when we choose? The irony was lost on no one.

These two elements collided in a fascinating demonstration of the aerial surveillance system that Ross McNutt's company, Persistent Surveillance Systems, provides to cities like Dayton, Ohio to help reduce crime (he also offers surveillance of airports and borders).

McNutt's aerially mounted custom-built cameras collect images too small to identify anybody - a dot per person - or read license plates (the cameras are looking down) but they do show two cars converging on a location, someone getting out of a third car and falling over, and a fourth car waiting elsewhere and meeting them afterwards. The fifth car that drives by turns into a driveway a few blocks away. A witness, maybe?

Trained analysts can step back and forth through time in these pictures, establishing roughly what happened, where the perpetrators and witnesses came from, and where they went. The police are then told where to look and given street-level location pictures sourced from Google Earth. McNutt explained the boundaries: his company is hired by a city; the contract with the city specifies how long to keep the data, often 14 days, almost always less than 45; the data is reviewed only in response to crime reports. The entire cost of the system including aircraft, command center, and the cameras he has built using off-the-shelf lenses is less than the price of a single police helicopter. When asked what abilities he'd like to add, McNutt didn't opt for higher resolution or greater intelligence; what he'd prefer is a wider area. His moments of greatest frustration come when a perpetrator vanishes off the edge of the area he's surveilling and he can't follow.

Now, this is a guy who's absolutely trying to do things right: he's consulting CFPers on his privacy policies! As for consent, probably lots of people in these cities welcome anything that reduces crime: a quarter of Dayton households are hit every year. The broader view was summed up by the ACLU's Jay Stanley: "Our worst nightmares are coming true." Will the next generation of McNutts be as careful and well-meaning?

Elsewhere everyone is fretting about two things. First is the Freedom Bill, which has been so badly weakened no one is entirely sure whether to support it and hope it's improved or oppose it because what if it's worse than nothing? Second is counteracting pervasive surveillance. "Technology" is a popular answer, but as many CFPers remember, "technology" was the early 1990s answer, and the crypto wars ended with intelligence agencies undermining the tools we trusted. This is what we've learned in a year: the situation is much worse than we thought. Today's answer - technology, policy/law, and civil action at both local and global levels - is the journey of a million miles that begins with a single, painful step.

And, as Katherine Albrecht commented in the Children and Privacy panel (and Terri Dowty said so often while running Action for the Rights of Children), the next generation is at risk of being conditioned to accept pervasive surveillance. Her particular example is the trend for schools to require students to wear ID tags - and sometimes RFID tags - at all times. Outside of school, says Danah Boyd in her recent book, It's Complicated: The Social Lives of Networked Teens, many teens have little privacy. Constantly monitored by parents and teachers, lockers and rooms subject to search, restricted in where they may go and with whom...online may be the only place they feel free.

Compare and contrast to the truckers studied by Karen Levy. Independent, authority-resistant, culturally macho, for decades they have shown compliance with federal regulations by keeping paper logs. Now, trucking companies are opting for electronic monitoring, and meeting resistance. Many older ones are threatening to quit. But will younger ones recognize the profound cultural shift? Or one day soon will we see an entry in Beloit College's annual Mindset List: "Everyone has always been under total electronic surveillance"?

Even faced with all this, the technologist and science fiction writer Ramez Naam remained optimistic: "The past was worse," he said in his talk. "The culture is changing." Let's hope.


June 7, 2014

Medic alert

What is medical data?

Probably the image that first pops into everyone's mind is the old folder - or maybe clipboard - holding a lot of paper with test results, charts, and doctors' notes. Fairly quickly that's replaced with the image of today's world as we think we know it: a computer screen showing that same information in a relatively structured way. The most modern of us imagine not a desktop screen on a doctor's desk but a smartphone with that same information. What we're still thinking of, however, is the paper file, however translated into an electronic record.

We are wrong.

This was the lesson of a panel Wednesday morning at the Privacy Health Summit. A lot of things that we don't think of as health data are nonetheless health data.

Start with apps and the information they collect as they count steps, watch your heart rate, remind you to take your pills. Kathryn Serkes tried to keep track of the proliferating health app field but finally stopped counting at 12,000. Sleep apps play white noise and promise an end to insomnia; Sleep Cycle monitors your movements in bed so it can wake you at the most favorable time. Other apps track gym activities or glucose levels, manage eating disorders, provide advice about symptoms. Still others are provided by companies for whom they're marketing tools: Nike training apps, Playtex Period Tracker.

Serkes went on to list the data generated by genealogical research, which may be correlated with genetic information. Inside companies, workplace wellness programs encourage employees to disclose health-related information that their employers can now access. Even retail shopping data may be health-related. You've started buying a bigger size; you put on weight - or are pregnant. On top of that is the good, old standby, records of search queries: does looking up diabetes on WebMD mean you're "diabetic-concerned"? Finally, there's the burgeoning category of what Andrew Matwyshyn called "consumer-generated health information", the stuff the quantified self people are tracking and sharing at meetings and conferences, and, of course, online.

The general point was that all of these myriad types of data are health data - but we do not think of them in that light, and they are not regulated as health data.

In the US, the key piece of legislation governing health data is the Health Insurance Portability and Accountability Act (1996). As Nicolas Terry said, both privacy advocates and those regulated by HIPAA have written screeds "eviscerating" it. But: "For thirteen or fourteen years we've all said that HIPAA protects most of our health data most of the time. That reality no longer exists."

The situation is a bit different in countries that, like the UK, have both nationalized health care and data protection laws (at least the first of which, I note in passing, Serkes would certainly campaign against). In the UK no one has yet suggested that care.data might expand to include these newer types of data. For the moment, NHS has little direct access to them. However, this will change soon in all countries. Implanted medical devices already send data to physicians for review, and it's logical that they will extend their oversight to data collected by apps used as part of a treatment regime. Many countries also are talking about ways to use various monitoring technologies to keep elderly and disabled people more safely in their homes for longer. These systems could be built with privacy in mind so that they only summon help when it's needed - but today's mindset seems likely to dictate that they should instead act as constant informants about their charges' activities.

This is less a problem in the EU, where, as Terry pointed out, data protection legislation governs all sectors. In the US those disparate levels of regulation already exist: a patchwork covers (some) individual domains, notably finance, health, genetics, and video rental. Unfortunately, as Terry put it, "Data likes to be free, and when you have multiple domains data tends to flow to the least-regulated domain."

This all led up to Frank Pasquale, who asked this question: "Do we want to live in a world where people have a body score that's as important and as pervasive as their credit scores? This is the world we're moving into right now."

He listed some examples. The credit card company that found a correlation between getting marriage counseling and defaulting on debts. Result: going for marriage counseling triggers a rise in your interest rate and a lower credit limit. Credit scoring, after all, began as a way of assessing the likelihood that an applicant would default on a loan. Now those scores appear in decisions about health insurance, employment, and other uses they were never designed for. So Pasquale asked: "What happens if we put all the big data databases together? Will it render HIPAA's gains nugatory?"

The clear perimeter that used to delineate health data is vanishing; it is no longer solely collected by experts in formal settings. In countries with solid data protection regimes it may not matter. In the US, the closest comparison seems to me to be Bring Your Own Device in businesses today. Both situations raise the question: do you know where your data is?
