
The secret adversary

"I always tell my students to write software assuming there's an adversary," jms.jpgthe University of Pennsylvania professor Jonathan M. Smith said to me this week. That may not sound like much at first glance, but at work it's an exceptionally powerful principle.

It isn't, of course, how we normally think. Most of us are pretty trusting of other people most of the time. We walk on city streets and wait on railway platforms generally confident that none of the people around us is likely to suddenly decide to push us into the path of oncoming vehicles. In fact, many people are now so trusting about the lack of threats on the street that they willingly block one of their most essential senses for spotting danger - hearing - while diverting all their visual attention to their phones. Evolutionarily speaking, they're great prospects for the Darwin Awards. Similarly, all of the drivers piloting tons of machinery at high speeds on highways assume that none of their fellow road users is suddenly going to turn their vehicle into a weapon of motorway destruction. Over time, as these things don't happen, we become increasingly secure in our assessment of the situation.

When you apply this habit of mind to today's big issues of surveillance and monitoring, the fact that so far most people haven't suffered much beyond unwanted advertising makes many of us complacent - unfortunately so. In a lengthy talk last December, The Moral Character of Cryptographic Work, the UC Davis computer scientist Phillip Rogaway noted the gap between academic cryptographers' idea of cryptography as a series of challenging but neutral mathematical puzzles and activists' and governments' assessment of it as an important political tool. In discussing this problem, Rogaway notes that "History teaches us that extensive government surveillance becomes political in character." In other words, no matter the intentions of today's large companies and governments in collecting so much data about each of us, the eventual outcome - political use - can be confidently predicted.

Rogaway argues for applying personal values to such decisions as what to work on and whom to take funding from. He also warns against the now-common idea that the problem of mass data collection can be solved by controlling access and use: the mere fact of collection, he writes, turns people into shallow conformists and chills political action, whether or not the data is ever visibly used.

Which brings me back to Smith's principle. In comments I submitted to the committee considering the draft Investigatory Powers Bill, I focused on the "hacker's charter" portion of the bill - that is, the part that would grant GCHQ the right to perform "bulk equipment interference". The bill talks about "computers" and "smartphones" without ever making plain that a computer is not what most of today's Parliamentarians probably think it is. A car is 70 computers on a wheelbase; even lightbulbs are computers now. So the bill's proposals grant GCHQ, and potentially the police, far greater and more wide-ranging powers than I suspect most MPs imagine: not just sneaking into your browsing history to look for jihadist websites, but querying your car's GPS to find out where you've driven and your child's toy to see what it's heard. The risk that scares me is that since even the best hackers make mistakes, the results could include 100-car pile-ups on the M1. It is a huge mistake to think the Investigatory Powers Bill is just about *data*.

This is why Smith's principle is so important. Anyone in the IT industry knows that any product might be hacked. But car and light bulb manufacturers do not think this way, and neither do the makers of the myriad other devices that are now becoming "smart" - that is, "open to attack". If everything is fair game for state-sponsored hackers, then everything has adversaries; it doesn't matter which state or what its motives are.

It's tempting to grab at a sports metaphor: say, the difference between figure skating and tennis. Both are competitions against others, but in figure skating your rivals never touch you, while in tennis the adversary is on the court with you, trying to frustrate you and turn every move you make into an opening to get past your defenses. We must assume that even a washing machine is now playing tennis.

Much of security is traditionally reactive, patched as holes become significant. We lock our houses because others' houses have been burgled. We hire regulators to ensure that restaurants, drugs, and electrical appliances are safe because others have had food poisoning, been sold snake oil, and died in fires. And yet some risks we conveniently stop seeing: the flu kills more people than terrorists do, but we fear it less; even without adversaries at the wheel, 32,000 people a year die in car crashes. Now, instead, we need to think ahead.

The driver's education I took as a teenager included a segment on defensive driving: assume at all times that the drivers around you might do something stupid or unexpected. The habit is even more important in cycling, where the balance of power between you and the other vehicles on the road is much less in your favor. So what we need, by extension, is defensive computing - only for "computing" read "design", because increasingly everything is software-driven.
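By way of illustration - my sketch, not anything from the bill or from Smith - defensive design for one of those software-driven things might mean a device that refuses any firmware update it cannot authenticate, rather than trusting whatever the network delivers. The key name and tag scheme here are hypothetical.

```python
import hashlib
import hmac

# Hypothetical per-device secret provisioned at manufacture; a real design
# would keep it in protected storage, never in source code.
DEVICE_KEY = b"replace-with-provisioned-secret"


def update_is_authentic(image: bytes, tag: bytes) -> bool:
    """Default-deny: apply a firmware image only if its HMAC-SHA256 tag verifies."""
    expected = hmac.new(DEVICE_KEY, image, hashlib.sha256).digest()
    # compare_digest runs in constant time; a plain == comparison would leak
    # timing information an adversary could use to forge tags byte by byte.
    return hmac.compare_digest(expected, tag)
```

The point is the default: the lightbulb, toy, or washing machine assumes the update server is an adversary until proven otherwise, instead of the other way around.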

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.

