I've spent a good portion of the last three years listening to smart, highly qualified researchers from a variety of disciplines try to develop real science around cybersecurity. And along in today's news comes exactly the kind of situation they're trying to eliminate.
At Ars Technica, Sean Gallagher reports that the reason Hillary Clinton used her private email server for work email was that the NSA refused to let her have a secure BlackBerry like the one it cleared for President Obama. The news emerged in documents released to Judicial Watch in response to its FOIA request. In the released communications, the NSA dismisses Clinton's request as too expensive and a matter of "personal comfort" rather than real need.
Clinton was, in other words, faced with the same dilemma as millions of people all over the world: do you do your job effectively, or do you hobble and frustrate yourself daily trying to comply with what the IT security department orders you to do? The security folks, meanwhile, made the same mistake tens of thousands of their colleagues make every day: they seem to have assumed that their unilateral decisions were so clearly right that they did not need to bother with the realities of human behavior. Electronic Frontier Foundation co-founder John Gilmore's often-repeated aphorism, "The Net interprets censorship as damage and routes around it", could be more correctly expanded to, "Humans interpret frustrating rules as blockages, and route around them."
Poorly designed security offers many examples. The restrooms are positioned outside the secure staff area, so rather than get up repeatedly to let visitors in and out, staff prop the door open with a wastebasket. Staff are banned from using social networks at work, but that's where all their contacts and messages are, so they use a VPN, a personal laptop, or a mobile phone, and the company has no idea and no backups. And, everyone's favorite: users are forced to change their passwords every 30 days and, unable to remember the current one, they write it down, recycle earlier passwords, rely on frequent resets, or pick something dumb but easily remembered.
So, in this case: the Secretary of State of the United States of America wants to use a BlackBerry. She's used to it, it fits her and her staff's mobile lives, and she doesn't like using desktop computers. Lots of us probably think that's dumb, as in the exasperated quote in one of those documents: "Why doesn't she use her desktop [in the secure area]?" Desktops have nice keyboards, big screens, and mature email clients. They're perfect for people who get lots of email - and spend most of their time in one location. A smart security person listens when the head of a department says, "We need this to do our jobs", because that's where the workaround will happen if they don't. The documents Judicial Watch uncovered make it appear that the NSA was completely intransigent in considering Clinton's request.
It seems apposite to invoke the 1967 Cool Hand Luke line: "What we've got here is a failure to communicate." You sort of know that security people are reading that same article and thinking, "Users are idiots." There's certainly some justification for that, but as Angela Sasse has written and said so often, security problems can't be solved by "fixing the users" - that is, by issuing orders and forcing them through awareness training. This is a clash of goals. Of course users don't want to see their companies (or governments) hacked and their (state) secrets leaked to the world, but securing those assets is not the job they've been hired to do. Security people, because it's both their job and their bent, see security as paramount. In many cases, that makes them rigid, and they fail to work with users to find solutions that serve both sets of goals.
A lot has to do with risk perception. To a journalist, encrypting their hard drive protects their data and their sources - but it also raises the risk that they might lose access to that data while on deadline. Between those colliding objectives, which is the greater personal risk for the journalist? Similarly, one can imagine Clinton and her staff deciding that the risk of missing an essential email until it was too late outweighed the risk of using her personal server.
From what I can see, computer security still languishes at the stage usability occupied in the early 1990s. Usability improved greatly when companies began hiring anthropologists and psychologists and setting them to watch where users got frustrated with hardware and software designs. (It's worth noting that this is getting worse now - what genius decided to plaster the web with grey type, for example?) Stories like this one cast security as the natural enemy of usability. Instead, security needs to draw on usability's toolkit. Security people need to think about where their rules will be inconvenient. The secure way of doing things needs to be built in from the beginning. It needs to not get in the way. The secure approach needs to be the easiest and most natural one to adopt - not for the security practitioner, but for the users.
Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.