
Sea of holes

"The Heartbeat security hole is just one more example of why open systems aren't necessarily better," a friend wrote to me this morning.

Not necessarily better, but not necessarily worse, either. Blaming open source for Heartbleed is like blaming dating sites for the divorce rate. Like everything else, it's a tradeoff. As Eric S. Raymond said in 1999, "Given enough eyeballs, all bugs are shallow." So the point is not that open source allowed the bug in OpenSSL's Heartbeat function to happen; it's that open source allowed the bug to be discovered and widely publicized without needing permission from an owner company - and possibly fixed faster. Wired, however, does point out that the economics of open source can make relying on it inherently risky; it predicts there will be more such catastrophes unless we can find a way to pay people to actively seek them out.

What allowed Heartbleed to happen is a confluence of things - some blame the C programming language, for example - beginning and ending with a human programming error. No intellectual property regime is ever going to eliminate programming errors.
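
For the morbidly curious, the mistake is easy to caricature. What follows is a hypothetical C sketch, not OpenSSL's actual source - the struct and the function names are invented for illustration - but it shows the pattern: the server answers a heartbeat by echoing back as many bytes as the request claims to contain, rather than as many as it actually received.

    /* Hypothetical sketch of the Heartbleed pattern - invented names,
       not OpenSSL's actual source. */
    #include <stdlib.h>
    #include <string.h>

    struct heartbeat {
        unsigned short claimed_len;   /* length field the client supplies */
        unsigned char  payload[1];    /* may hold far fewer bytes */
    };

    unsigned char *echo_heartbeat(const struct heartbeat *hb,
                                  size_t bytes_received)
    {
        (void)bytes_received;  /* the buggy code never consults this */
        unsigned char *reply = malloc(hb->claimed_len);
        if (reply == NULL)
            return NULL;
        /* BUG: copies claimed_len bytes even when the client sent fewer,
           so up to ~64 KB of whatever sits next to the payload on the
           heap - passwords, session cookies, private keys - leaks into
           the reply. */
        memcpy(reply, hb->payload, hb->claimed_len);
        return reply;
    }

The fix that shipped is essentially the missing bounds check: if the claimed length doesn't fit inside the message that actually arrived, the message is discarded.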

The basics of Heartbleed - what it is, why you should care, and what you need to do about it - have been helpfully laid out on a vast number of security and general news sites: Scientific American; Sophos; Bruce Schneier; Krebs; Codenomicon, the group that found it. There are more warnings about the future from the New York Times, and you can test a server and, if yours is afflicted, follow EFF's advice for system administrators to fix it.

The short term is a lot of scrambling, patching, fixing, and password changing. What then? Is this to be our collective future? One long series of security warnings until we stop listening and either go offline permanently or assume we're going to be robbed and surveilled every day?

The good news, in a twisted sense, is that the hard work engineers were already beginning to do to harden the Internet against endemic secret agency spying ought to automatically include paying attention to fundamental flaws in the Internet's infrastructure codebase. SSL already had serious enough problems to need a new approach: the bug in OpenSSL can be fixed much more quickly than the chronic problem of the rickety certificate authentication system. Users and even many site administrators do not know what to do when they're told a site's certificate doesn't match or has expired; certificate authorities can be hacked; and so on.

EFF argues that an important piece of fixing the infrastructure lies in implementing Perfect Forward Secrecy. PFS doesn't fix any particular vulnerability, but it does perform the important task of limiting the damage when a vulnerability allows private cryptographic keys to be stolen. As things stand, an attacker using something like Heartbleed (or one of those cute little bugs that sits in place quietly exfiltrating information for years before it's discovered) can patiently pile up data, not caring that it's encrypted and consequently unreadable. Because: eventually, one day, the captured data may include a private key that can be used to retroactively decrypt the lot. This is little different from what hackers do when they patiently spend years writing down information gleaned from wanderings around the Internet until one day a user ID meets up with a password and they're into something like Prince Philip's mailbox, or what a large government agency might do with unlimited storage facilities, knowing that the computing power needed to brute-force keys increases daily. What PFS does is ensure that each session is encrypted with a fresh, ephemeral key that is discarded afterwards, so that even a stolen long-term key cannot retroactively decrypt recorded traffic; when a hole is discovered you can immediately revoke the compromised key, and your exposure is limited to the relatively small amount of data captured in the meantime.
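
To make that concrete, here is a toy C sketch of the idea behind PFS - hypothetical code, nobody's real implementation, with parameters absurdly too small for real use. Each session runs a fresh ephemeral Diffie-Hellman exchange and then throws the secrets away, so a recorded transcript plus a later-stolen long-term key still yields nothing.

    /* Toy sketch of forward secrecy via ephemeral Diffie-Hellman -
       hypothetical code with insecurely tiny parameters. */
    #include <stdint.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    static const uint64_t P = 2305843009213693951ULL; /* 2^61 - 1, prime */
    static const uint64_t G = 3;                      /* toy generator */

    /* Modular exponentiation; __int128 (GCC/Clang) avoids overflow. */
    static uint64_t powmod(uint64_t base, uint64_t exp, uint64_t mod)
    {
        unsigned __int128 acc = 1, b = base % mod;
        while (exp) {
            if (exp & 1)
                acc = acc * b % mod;
            b = b * b % mod;
            exp >>= 1;
        }
        return (uint64_t)acc;
    }

    static uint64_t session_key(void)
    {
        /* Fresh ephemeral secrets, used for this one session only
           (toy randomness - a real implementation uses a CSPRNG). */
        uint64_t a = (((uint64_t)rand() << 31) | rand()) % (P - 2) + 1;
        uint64_t b = (((uint64_t)rand() << 31) | rand()) % (P - 2) + 1;
        uint64_t A = powmod(G, a, P);       /* public, sent on the wire */
        uint64_t B = powmod(G, b, P);       /* public, sent on the wire */
        uint64_t client_key = powmod(B, a, P);
        uint64_t server_key = powmod(A, b, P);
        /* Both sides derive the same key; a and b are then forgotten,
           so nothing recorded from the wire can reproduce it later. */
        return client_key == server_key ? client_key : 0;
    }

    int main(void)
    {
        srand((unsigned)time(NULL));
        printf("session 1 key: %llu\n", (unsigned long long)session_key());
        printf("session 2 key: %llu\n", (unsigned long long)session_key());
        return 0;
    }

Real TLS deployments get this property from the ephemeral (EC)DHE cipher suites, which is roughly what EFF is urging server operators to prefer.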

It's an important point, but it's still just a technical patch. Earlier today, Richard Clayton pointed to a 2008 report he, Ross Anderson, Rainer Böhme, and Tyler Moore wrote for ENISA that, among other things, recommends introducing product liability into the software industry to foster systemic change. Acknowledging the political and practical difficulties of such a move, the authors suggest, "A good starting point would be to require vendors to certify that their products are secure by default." This would in fact apply even to something as arcane as a cryptographic library like OpenSSL, open source though it is, because users never touch it until it has been embedded in a bigger product whose vendor does interact with the user. I like this suggestion, in part because so much of the Internet's expansion has happened by patching stuff up along the way - an ideal way to create vulnerabilities.

Cue this quote from Gene Spafford: "The only truly secure system is one that is powered off, cast in a block of concrete and sealed in a lead-lined room with armed guards - and even then I have my doubts."

None of us can live like that. But the way things are right now, there's half a hole in our pockets, and we're left wondering where the other half is.


Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series. Stories about the border wars between cyberspace and real life are posted throughout the week at the net.wars Pinboard - or follow on Twitter.

