
Bug-a-boo

A few weeks ago Matt Blaze, the head of the distributed systems lab at the University of Pennsylvania, and Susan Landau published an opinion piece in Wired arguing that the FBI needs hackers, not back doors. This week, together with co-authors Steve Bellovin, a professor of computer science at Columbia University, and Sandy Clark, a graduate student in Blaze's lab, they published Going Bright: Wiretapping without Weakening Communications Infrastructure, a paper making their argument more formally.

The gist is straightforward enough: when you pass a law mandating the installation of back doors in communications equipment, you create, of necessity, a hole. To you, that may be legal access for interception (wiretapping); to the rest of us it's a security vulnerability that can be exploited by criminals and trolls every bit as effectively as one of those unpatched zero-day bugs that keep making the news - like yesterday's announcement that the Federal Reserve's systems had been hacked. So instead of creating new holes, why not simply develop the expertise and tools to exploit the ones that already exist and seem to be an endemic part of creating complex software?

Yeah, no, they're not joking. April isn't for some time yet.

A little more background. In 1994, the US passed the Communications Assistance for Law Enforcement Act (CALEA). It was promptly followed by legislation mandating lawful interception in Europe, Canada, and elsewhere. Since then, law enforcement in those countries has persistently tried to expand the range of services required to install such equipment. In the UK, the current government proposes the Communications Capabilities Development Programme (CCDP), which would install deep packet inspection (DPI) boxes on ISPs' networks so that all communications can be surveilled.

There are many, many problems with this approach. One is cost; fellow Open Rights Group advisory council member Alec Muffett has done a wonderful job of pointing that out for CCDP in particular. If, he writes, you require a whole lot of small and medium-sized companies to install a proprietary piece of hardware/software that perforce must be frequently updated, you have just given the vendors of these items "a license to print money".

The bigger problem, however, as Landau wrote in 2005 (PDF), is security. A hole is a hole; a burglar who finds an unlocked door isn't deterred by its having been intended solely for the use of the homeowner. The Internet is different, she argues, and the insecurities you create when you try to apply CALEA to modern telephony - digital, carried over the Internet as one among many flows of data packets rather than over a dedicated direct circuit connection - have far-reaching implications that include serious damage to national security.

Nothing since has invalidated that argument. If you'd like some technical details, here's Cisco describing how it works: as you'll see, the interception element creates two streams, sending one on unaltered and sending the other to the monitoring address. Privacy International's Eric King has exposed the international trade in legally mandated surveillance technologies. Finally, as Blaze, Landau, Clark, and Bellovin write here, recent years have turned up hard evidence that lawful intercept back doors have been exploited. The most famous case is the 2004 pre-Olympic incident in which more than 100 Greek government officials and other dignitaries had their cellphones tapped via software installed on the Vodafone Greece network. So their argument that this approach is dangerous is, I think, well-founded.
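To make the mechanics concrete, here is a minimal Python sketch of that duplication step - not Cisco's code, just an illustration with made-up addresses - showing why the tap is invisible to the target and why whoever controls the monitoring address sees everything:

    import socket

    # Hypothetical monitoring endpoint (a documentation address, not a real one).
    MONITOR_ADDR = ("192.0.2.10", 9000)

    def intercept_and_forward(packet, forward_sock, dest, monitor_sock):
        """Forward a packet unchanged to its real destination and send an
        identical copy to the monitoring address. The target's traffic is
        untouched, so the tap is undetectable from the target's end."""
        forward_sock.sendto(packet, dest)          # stream 1: on to the callee, unaltered
        monitor_sock.sendto(packet, MONITOR_ADDR)  # stream 2: the wiretap copy

    if __name__ == "__main__":
        fwd = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        mon = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        intercept_and_forward(b"voice payload", fwd, ("198.51.100.7", 5004), mon)

Anyone who can repoint that monitoring address - by compromising the switch's management interface, say - inherits the tap wholesale, which is exactly what happened in Athens.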

The FBI, like other law enforcement services, is complaining that its ability to intercept communications is "going dark". There are many possible responses to that, and many people, including these authors, have made them. Even if they can no longer intercept phone calls with a simple nudge to a guy in a control room at AT&T/BT, they have much, much more data accessible to them from all sorts of sources; surveillance has become endemic. And the decades of such complaints make it easy to be jaded about this: 20 years ago, it was the government's argument for why the use of strong cryptography had to be restricted. We know how that turned out: the need to enable electronic commerce won that particular day, and somehow civilization survived.

But if we accept that there is a genuine need for *some* amount of legal wiretapping as a legitimate investigative tool, then what? Hence this paper's suggestion that a less-worse alternative is to encourage the FBI and others to exploit the vulnerabilities that already exist in modern computer systems. Learn, in other words, to hack. Yes, over time those vulnerabilities will get closed up, but there will inevitably be new ones. Like cybercriminals, law enforcement will have to be adept at adapting to changing technology.

The authors admit there are many details to be worked out with respect to policy, limitations, and so on. It seems to me inevitable - because of the way incentives work on humans - that if we pursue this path there will come a point where law enforcement or the security services quietly pressure manufacturers not to fix certain bugs because they've become too useful. And then you start wondering: in this scenario do people like UCL's Brad Karp, whose mission is to design systems with smaller attack surfaces, become enemies of the state?


Wendy M. Grossman's Web site has an extensive archive of her books, articles, and music, and an archive of the earlier columns in this series.

