" /> net.wars: January 2020 Archives


January 17, 2020

Software inside

In 2011, Netscape creator turned venture capitalist Marc Andreessen argued that software is eating the world. Andreessen focused on a rather narrow meaning of "world" - financial value. Amazon ate Borders' lunch; software fuels the success of Wal-Mart, FedEx, airlines, and financial services. Like that.

There is, however, a more interesting sense in which software is eating the world, and that's its takeover of what we think of as "hardware". A friend tells me, for example, that part of the pleasure he gets from driving a Tesla is that its periodic software updates keep the car feeling new, so he never looks enviously at the features on later models. Still, these updates do at least sound like traditional software. The last update of 2019, for example, included improved driver visualization, a "Camp Mode" to make the car more comfortable to spend the night in, and other interface improvements. I assume something as ordinarily useful as map updates is too trivial to mention.

All of this means a car is now really a fancy interconnected series of dozens of computer networks whose output happens to be making a large, heavy object move on wheels. Even so, I don't have trouble grasping the whole thing, not really. It's a control system.

Much more confounding was the time, in late 1993, when I visited Demon Internet, then a startup founded to offer Internet access to UK consumers. Like quite a few others, I was having trouble getting connected via Demon's adapted version of KA9Q, connection software originally written for packet radio. This was my first puzzlement: how could software for "packet radio" (whatever that was) do anything on a computer? That was nothing to my confusion when Demon staffer Mark Turner explained to me that the computer could parse the stream of information coming into it and direct the results to different applications simultaneously. At that point, I'd only ever used online services where you could only do one thing at a time, just as you could only make one phone call at a time. I remember finding the idea of one data stream servicing many applications at once really difficult to grasp. How did it know what went where?

That is software, and it's what happened in the shift from legacy phone networks' circuit switching to Internet-style packet switching.
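
For the curious, the answer is addressing: every packet carries a label - in TCP/IP, a port number - telling the receiving machine which application it belongs to. Here is a toy sketch of that demultiplexing in Python; the packet layout and handlers are invented purely for illustration:

    # Toy port-based demultiplexing: one interleaved stream of
    # packets, routed to many applications at once.
    from typing import NamedTuple

    class Packet(NamedTuple):
        dest_port: int   # label saying which application this is for
        payload: bytes

    # Each application registers a handler for "its" port.
    handlers = {
        25: lambda data: print("mail:", data.decode()),
        80: lambda data: print("web: ", data.decode()),
    }

    def demultiplex(stream):
        # Send each packet in the shared stream to the right app.
        for packet in stream:
            handler = handlers.get(packet.dest_port)
            if handler:
                handler(packet.payload)

    # One wire, two conversations, interleaved:
    demultiplex([
        Packet(80, b"<html>hello</html>"),
        Packet(25, b"Subject: hi"),
        Packet(80, b"<p>more</p>"),
    ])

One wire, many simultaneous conversations; the software, not the wire, keeps them straight.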

I had a similar moment of surreality when first told about software-defined radio. A radio was a *thing*. How could it be software? By then I knew about frequency-hopping spread spectrum, patented during the war by the actress Hedy Lamarr and the composer George Antheil to keep the radio signals guiding torpedoes from being jammed, so it shouldn't have seemed as weird as it did.

And so to this week, when, at the first PhD Cyber Security Winter School, I discovered programmable - that is, software-defined - networks. Of course networks are controlled by software already, but at the physical layer it's cables, switches, and routers. If one of those specialized devices needs to be reconfigured, you have to do it locally, device by device. Now the idea is more generic hardware that can be reprogrammed on the fly, enabling remote - and more centralized and larger-scale - control. Security people like the idea that a network can both spot and harden itself against malicious traffic much faster. I can't help being suspicious that this new world will help attackers, too, first by providing a central target to attack, and second because it will be vastly more complex. Authentication and encryption will be crucial in an environment where a malformed or malicious data packet doesn't just pose a threat to the end user who receives it but can reprogram the network. Helpfully, the NSA has thought about this in more depth and greater detail. They do see centralization as a risk, and recommend a series of measures for protecting the controller; they also highlight the problems increased complexity brings.
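
What that centralized reprogramming looks like, schematically: a controller holds the policy and pushes match-action rules to every switch at once, instead of an operator reconfiguring each box by hand. A toy sketch in Python - the classes and rule format here are invented for illustration; real networks use protocols such as OpenFlow or languages such as P4:

    # Toy model of the software-defined idea: one controller,
    # many switches, rules installed network-wide in one step.
    class Switch:
        def __init__(self, name):
            self.name = name
            self.rules = []          # ordered match-action table

        def install(self, rule):
            self.rules.append(rule)
            print(f"{self.name}: installed {rule}")

    class Controller:
        def __init__(self, switches):
            self.switches = switches

        def block(self, src_ip):
            # React to detected malicious traffic everywhere at once.
            rule = {"match": {"src": src_ip}, "action": "drop"}
            for switch in self.switches:
                switch.install(rule)

    net = Controller([Switch("edge-1"), Switch("edge-2"), Switch("core")])
    net.block("203.0.113.7")   # one decision, applied network-wide

The speed is the appeal; the single point of decision is, equally, the single point of attack.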

As the workshop leader said, this is enough of a trend for Cisco and Intel to embrace it; six months ago, Intel paid $5 billion for Barefoot Networks, the creator of P4, the language I saw demonstrated for programming these things.

At this point I began wondering if this doesn't up-end the entire design philosophy of the Internet, which was to push all the intelligence out to the edges. The beginnings of this new paradigm, active networking, appeared in the mid-1990s. The computer science literature - for example, Activating Networks (PDF), by Jonathan M. Smith, Kenneth L. Calvert, Sandra L. Murphy, Hilarie K. Orman, and Larry L. Peterson, and Active Networking: One View of the Past, Present, and Future (PDF), by Smith and Scott M. Nettles - plots out the problems of security and complexity in detail, and considers the Internet and interoperability issues. The Road to SDN: An Intellectual History of Programmable Networks, by Nick Feamster, Jennifer Rexford, and Ellen Zegura, recapitulates the history to date.

My real question, however, is one I suspect has received less consideration: will these software-defined networks make surveillance and censorship easier or harder? Will they have an effect on the accessibility of Internet freedoms? Are there design considerations we should know about? These seem like reasonable questions to ask as this future hurtles toward us.

Illustrations: Hedy Lamarr, in The Conspirators, 1944.

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.

January 10, 2020

The forever bug

Y2K is back, and this time it's giggling at us.

For the past few years, there's been a growing drumbeat on social media and elsewhere to the effect that Y2K - "the year 2000 bug" - never happened. It was a nothingburger. It was hyped then, and anyone saying now it was a real thing is like, ok boomer.

Be careful what old averted messes you dismiss; they may come back to fuck with you.

Having lived through it, we can tell you the truth: Y2K *was* hyped. It was also a real thing that was wildly underestimated for years before it was taken as seriously as it needed to be. When it finally registered as a genuine and massive problem, millions of person-hours were spent remediating software, replacing or isolating systems that couldn't be fixed, and making contingency and management plans. Lots of things broke, but, because of all that work, nothing significant on a societal scale. Locally, though, anyone using a computer at the time likely has a personal Y2K example. In my own case, an instance of Quicken continued to function but stopped autofilling dates correctly. For years I entered dates manually before finally switching to GnuCash.

The story, parts of which Chris Stokel-Walker recounts at New Scientist, began in 1971, when Bob Bemer published a warning about the "Millennium Bug", having realized years earlier that the common practice of saving memory space by using two digits instead of four to indicate the year was storing up trouble. He was largely ignored, in part, it appeared, because no one really believed the software they were writing would still be in use decades later.

It was the mid-1990s before the industry began to take the problem seriously, and when they did the mainstream coverage broke open. In writing a 1997 Daily Telegraph article, I discovered that mechanical devices had problems, too.

We had both naysayers, who called Y2K a boondoggle whose sole purpose was to boost the computer industry's bottom line, and doom-mongers, who predicted everything from planes falling out of the sky to total societal collapse. As Damian Thompson told me for a 1998 Scientific American piece (paywalled), the Millennium Bug gave apocalyptic types a *mechanism* by which the crash would happen. In the Usenet newsgroup comp.software.year-2000, I found a projected timetable: bank systems would fail early, and by April 1999 the cities would start to burn... When I wrote that society would likely survive because most people wanted it to, some newsgroup members called me irresponsible, and emailed the editor demanding he "fire this dizzy broad". Reconvening ten years later, they apologized.

Also at the extreme end of the panic spectrum was Ed Yardeni, then chief economist at Deutsche Bank, who repeatedly predicted that Y2K would cause a worldwide recession; it took him until 2002 to admit his mistake, crediting the industry's hard work.

It was still a real problem, and with some workarounds and a lot of work most of the effects were contained, if not eliminated. Reporters spent New Year's Eve at empty airports, in case there was a crash. Air travel that night, for sure, *was* a nothingburger. In that limited sense, nothing happened.

Some of those fixes, however, were not so much fixes as workarounds. One of these finessed the rollover problem by creating a "window" and telling systems that two-digit years fell between 1920 and 2020, rather than 1900 and 2000. As the characters on How I Met Your Mother might say: "It's a problem for Future Ted and Future Marshall. Let's let those guys handle it."
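
For anyone who wants to see the trick, here is a minimal sketch of such a window in Python, assuming a pivot of 20; real systems chose their own pivots:

    # The "windowing" workaround: interpret two-digit years as
    # falling in a 100-year window, here 1920-2019.
    PIVOT = 20  # two-digit years below this are 20xx, the rest 19xx

    def expand_year(yy: int) -> int:
        # Map a two-digit year into the window 1920..2019.
        return 2000 + yy if yy < PIVOT else 1900 + yy

    assert expand_year(99) == 1999
    assert expand_year(5) == 2005
    assert expand_year(20) == 1920   # in 2020, "20" reads as 1920

That last line is exactly the bug coming back.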

So, it's 2020, we've hit the upper end of the window, the bug is back, and Future Ted and Future Marshall are complaining about Past Ted and Past Marshall, who should have planned better. But even if they had...the underlying issue is temporary thinking that leads people to still - still, after all these decades - believe that today's software will be long gone 20 years from now and therefore they need only worry about the short term of making it work today.

Instead, the reality is, as we wrote in 2014, that software is forever.

That said, the reality is also that Y2K is forever, because if the software couldn't be rewritten to take a four-digit year field in 1999 it probably can't be today, either. Everyone stresses the need to patch and update software, but a lot - for an increasing value of "a lot" as Internet of Things devices come on the market with no real idea of how long they will be in service - of things can't be updated for one reason or another. Maybe the system can't be allowed to go down; maybe it's a bespoke but crucial system whose maintainers are long gone; maybe the software is just too fragile and poorly documented to change; maybe old versions propagated all over the place and are laboring on in places where they've simply been forgotten. All of that is also a reason why it's not entirely fair for Stokel-Walker to call the old work "a lazy fix". In a fair percentage of cases, creating and moving the window may have been the only option.

But fret ye not. We will get through this. And then we can look forward to 2038, when the 32-bit Unix clocks run out. Future Ted and Future Marshall will handle it.
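
The arithmetic behind that date, for anyone who wants to check it: systems that store time as a signed 32-bit count of seconds since the Unix epoch run out of room in January 2038. In Python:

    # The 2038 problem in one calculation: a signed 32-bit counter
    # of seconds since 1970-01-01 UTC overflows at 2**31 - 1.
    from datetime import datetime, timedelta, timezone

    epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
    rollover = epoch + timedelta(seconds=2**31 - 1)
    print(rollover)   # 2038-01-19 03:14:07+00:00, then the counter wraps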


Illustrations: Millennium Bug manifested at a French school (via Wikimedia).

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.

January 3, 2020

Chronocentric circles

We wrapped up 2018 with a friend's observation that there was no excitement around technology any more; we conclude the Year of the Bedbug with the regularly heard complaint that the Internet isn't *fun* any more. The writer of this last piece, Brian Koerber, is at least a generation later in arriving online than I was, and he's not alone: where once the Internet was a venue for exploring the weird and unexpected and imagining a hopeful future, increasingly it's a hamster wheel of the same few, mostly commercial, sites and services, which may be entertaining but do not produce any sense of wonder in their quest to exploit us all. Phillip Maciak expands the trend by mourning the death of innovative web publishing, while Abid Omar calls today's web an unusable, user-hostile wasteland. In September, Andres Guadamuz wondered if boredom would kill the Internet; we figure it's a tossup between that and the outrageous energy consumption.

The feeling of sameness is exacerbated by the fact that so many of this year's stories have been mutatis mutandis variations on those of previous years. Smut-detecting automated bureaucrats continue to blame perfectly good names for their own deficiencies, 25 years after AOL barred users from living in Scunthorpe; the latest is Lyft. Less amusingly, for the ninth year in a row, Freedom House finds that global Internet freedom has declined; of the 65 countries it surveys, only 16 have seen improvement, and that only marginal.

Worse, the year closed with the announcement of perhaps the most evil invention of recent years, the toilet designed to deter lingering. "Most evil", because the meanness is intentional, rather than the result of a gradual drift away from founding values.

Meanwhile, the EU passed a widely disliked copyright-tightening bill. The struggle to change it from threat to opportunity burned out yet another copyright warrior: now-former MEP Julia Reda. It appears increasingly impossible to convince national governments that there is no such thing as a hole - in a wall or in encryption software - that only "good guys" can use (and still less that "good guys" is entirely in the eyes of the beholder). After four years of effort to invent mechanisms for it, age verification may have died...or it may come back as a "duty of care" in whatever legislation builds upon the Online Harms white paper - or in the EU's Audiovisual Media Services Directive. And, nearly two years on, US sites are still ghosting EU residents for fear of GDPR and its potentially massive fines. With the January 1 entry into force of the California Consumer Privacy Act, the US west coast seems set to join us. Hot times for corporate lawyers!

The most noticeable end-of-year trend, however, has been the return of the decade as a significant timeframe and the future as ahead of us. In 2010, the beginning of a decade in which people went from boasting about their smartphones to boasting about how little they used them, no one mentioned the end of the decade, perhaps because we were all still too startled to be living in the third millennium and the 21st century, known as "the future" for the first decades of my life. Alternatively, perhaps, as a friend suggests, it's because the last couple of years have been so exhausting and depressing that people are clinging to anything that suggests we might now be in for something new.

At Vanity Fair, Nick Bilton has a particularly disturbing view of 2030, and he doesn't even consider climate change, water supplies, the rise of commercial podcasts, or cybersecurity.

I would highlight instead a couple of small green shoots of optimism. The profligate wastage exposed by the WeWork IPO appears to be sparking a very real change in both the Silicon Valley venture capital funding ethos (good) and the cost basis of millennial lifestyles (more difficult), or "counterfeit capitalism", as Matt Stoller calls it. Even Wired is suggesting that the formerly godlike technology company founder is endangered. Couple that with 2019's dramatic and continuing rise in employee activism within technology companies and increasing regulatory pressure, particularly on Uber and Airbnb, and there might be some cause to hope for change. Even though company founders like Mark Zuckerberg and Sergey Brin and Larry Page have made themselves untouchable by controlling the majority of voting shares in their companies, they won't *have* companies if they can't retain the talent. The death of the droit de genius ethos that the Jeffrey Epstein case exposed can't come soon enough.

I also note the sudden rebirth of personal and organizational online forums, based on technology such as Mastodon and Soapbox. Some want to focus on specific topics and restrict members to trusted colleagues; some want a lifeboat (paywall) in case of a Twitter ban; WT Social wants to change the game away from data exploitation. Whether any of these will have staying power is an open question; a decade ago, when Diaspora tried to decentralize social media, it failed to gain traction. This time round, with greater consciousness of the true price of pay-with-data "free" services, these return-to-local efforts may have better luck.

Happy new year.

Illustrations: Roborovski hamster (via Wikimedia).

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.