Software is forever
The end of official support for Windows XP has occasioned a lot of unsympathetic comments like: Windows 7 (and 8) has fundamentally better built-in security, you should have switched long ago anyway; they gave you years of notice; sheesh, they supported it for 13 years; nothing lasts forever.
The notable dissenter, whom I encountered at the event launching Trustwave's 2014 report, was Matt Palmer, chair of the Channel Islands Information Security Forum, who argued instead that the industry needs a fundamental rethink: "Very few organizations, small or large, can afford to turn over their software estate on a three-to-five-year basis," he said, going on to ask: "Why are they manufacturing software and only supporting it for a short period?"
In other words, as he put it more succinctly afterwards: we need to stop thinking of software as temporary.
This resonates strongly with anyone who remembers that precisely this short-term attitude, the belief that software is temporary, caused the Y2K problem. For those who came in late or believe that the moon landings were faked: despite much media silliness (I remember being asked if irons might be affected), Y2K was a genuine problem. It affected many types of both visible and invisible software, in ways ranging from trivial to serious. The root cause was that throughout most of the second half of the 20th century coders saved precious memory by storing the year in two-digit fields. Come 2000, such software couldn't distinguish 1935 from 2035; disambiguation required four-digit fields. "Nothing happened" because coder-millennia were spent fixing code. Remediating Y2K cost an estimated $100 billion in the US alone, and all because coders in the 1950s, 1960s, 1970s, 1980s, and even some of the 1990s did not believe their software would still be in use come January 1, 2000. The date of the earliest warning not to think like that? A 1979 paper by Bob Bemer.
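To make the ambiguity concrete, here is a minimal sketch of the "windowing" workaround many remediation teams applied when fields couldn't be widened; the function name and the pivot value of 50 are illustrative assumptions, not taken from any particular system.

```python
def year_from_two_digits(yy: int, pivot: int = 50) -> int:
    """Expand a two-digit year using a pivot window: values at or
    above the pivot are read as 19xx, values below as 20xx.
    The pivot of 50 is an illustrative assumption, not a standard."""
    return 1900 + yy if yy >= pivot else 2000 + yy

# A field stored as "35" is inherently ambiguous: 1935 or 2035?
print(year_from_two_digits(35))  # 2035 under this window
print(year_from_two_digits(99))  # 1999
```

Windowing only defers the ambiguity, of course; the durable fix was widening to four-digit fields.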
Thirteen years ago, when XP was launched, computer hardware was still relatively expensive. It was before cheap netbooks, let alone tablets or smartphones. But unlike other high-cost items at the time, most people knew that computers were advancing so quickly that the machine they bought was more likely to become obsolete than to wear out. You bought a new computer because your old machine couldn't keep up with the new things you wanted to do.
That is not true any more except at the high end. My current desktop machine dates to 2007, although it's had facelifts since then: a couple of bigger hard drives and more memory. The depth of the change in attitudes became apparent in 2012, when a double power outage blatted the graphics card. Even technical people couldn't see any reason to replace anything more than just that graphics card. Ten years ago, everyone would have been telling me to replace the whole machine because it was so outdated. Even now, although I'm sure I'd be impressed by the increased speed of a new one, the old one has no apparent limitations. So when the Windows 8 upgrade dingus says I need a new PC, I beg to differ.
Tablets and smartphones, which are still changing and adding capabilities at a rapid pace, are a different story. For now, that is: these segments of the industry, too, will mature and slow down once most people find their existing equipment adequate for their needs.
You can still argue that XP is dead and we should all just get over it. But in the longer term, living with old software is where we're going. This is particularly the case with the "Internet of Things" that vendors are so eager to build.
A few weeks ago, I was at an entertaining lunchtime demonstration by NCC Group that made this point nicely. The team reprogrammed hotel door locks using an NFC-enabled smartphone, attacked broadband routers and Homeplugs, and turned a smart TV into an audio/video surveillance device.
The group listed four main issues:
- Embedded software designers still assume that only machines will communicate with their devices; they don't plan for malicious humans and therefore tend to think security does not matter.
- Embedded software designers still think "security by obscurity" works.
- Vulnerabilities are likely to persist for many years, since even if firmware updates become available, no one will risk bricking their TV or car by applying them. The Homeplugs, for example, which carry networking around the house via the electrical wiring, all have the same default password, which you can only change via an elaborate procedure that the manufacturer warns could make the network inoperable. What's that line you always hear on TV? Oh, yes: don't try this at home.
- Interoperability always trumps security.
And this: analysts are predicting 20 billion Internet of Things devices by 2020.
People expect to measure the lives of refrigerators, thermostats, cars, or industrial systems in decades, not months or years. Even if you want to say it's unreasonable and stupid that people and companies still have old XP boxes running specialized, irreplaceable applications today, one day soon it's your attitude that will be unreasonable. Software has a much longer lifespan than its coders like to think, and that will only become more true.
Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.