net.wars: February 2020 Archives


February 14, 2020

Pushy algorithms

One consequence of the last three and a half years of British politics, which saw everything sucked into the Bermuda Triangle of Brexit debates, is that things that appeared to have fallen off the back of the government's agenda are beginning to reemerge like so many sacked government ministers hearing of an impending cabinet reshuffle and hoping for reinstatement.

One such is age verification, which was enshrined in the Digital Economy Act (2017) and last seen being dropped to wait for the online harms bill.

A Westminster Forum seminar on protecting children online shortly before the UK's December 2019 general election reflected that uncertainty. "At one stage it looked as if we were going to lead the world," Paul Herbert lamented before predicting it would be back "sooner or later".

The expectation for this legislation was set last spring, when the government released the Online Harms white paper. The idea was that a duty of care should be imposed on online platforms, effectively defined as any business-owned website that hosts "user-generated content or user interactions, for example through comments, forums, or video sharing". Clearly they meant to target everyone's current scapegoat, the big social media platforms, but "comments" is broad enough to include any ecommerce site that accepts user reviews. A second difficulty is the variety of harms they're concerned about: radicalization, suicide, self-harm, bullying. They can't all have the same solution even if, like one bereaved father, you blame "pushy algorithms".

The consultation exercise closed in July, and this week the government released its response. The main points:

- There will be plentiful safeguards to protect freedom of expression, including distinguishing between illegal content and content that's legal but harmful; the new rules will also require platforms to publish and transparently enforce their own rules, with mechanisms for redress. Child abuse and exploitation and terrorist speech will have the highest priority for removal.

- The regulator of choice will be Ofcom, the agency that already oversees broadcasting and the telecommunications industry. (Previously, enforcing age verification was going to be pushed to the British Board of Film Classification.)

- The government is still considering what liability may be imposed on senior management of businesses that fall under the scope of the law, which it believes is less than 5% of British businesses.

- Companies are expected to use tools to prevent children from accessing age-inappropriate content "and protect them from other harms" - including "age assurance and age verification technologies". The response adds, "This would achieve our objective of protecting children from online pornography, and would also fulfill the aims of the Digital Economy Act."

There are some obvious problems. The privacy aspects of the mechanisms proposed for age verification remain disturbing. The government's 5% estimate of businesses that will be affected is almost certainly a wild underestimate. (Is a Patreon page with comments the responsibility of the person or business that owns it, or of Patreon itself?) At the Guardian, Alex Hern explains the impact on businesses. The nastiest tabloid journalism is not within scope.

On Twitter, technology lawyer Neil Brown identifies four fallacies in the white paper: the "Wild West web"; that privately operated computer systems are public spaces; that those operating public spaces owe their users a duty of care; and that the offline world is safe by default. The bigger issue, as a commenter points out, is that the privately operated computer systems the UK government seeks to regulate are foreign-owned. The paper suggests enforcement could include punishing company executives personally and ordering UK ISPs to block non-compliant sites.

More interesting and much less discussed is the push for "age-appropriate design" as a method of harm reduction. This approach was proposed by Lorna Woods and Will Perrin in January 2019. At the Westminster eForum, Woods explained, "It is looking at the design of the platforms and the services, not necessarily about ensuring you've got the latest generation of AI that can identify nasty comments and take it down."

It's impossible not to sympathize with her argument that the costs of "move fast and break things" are imposed on the rest of society. However, when she started talking about doing risk assessments for nascent products and services, I could only think she's never been close to software developers, who have known for decades that from the instant software reaches users' hands they will use it in ways no one ever imagined. So it's hard to see how this will work, though last year the ICO proposed a code of practice.

The online harms bill also has to be seen in the context of all the rest of the monitoring that is being directed at children in the name of keeping them - and the rest of us - safe. DefendDigital.me has done extensive work to highlight the impact of such programs as Prevent, which requires schools and libraries to monitor children's use of the Internet to watch for signs of radicalization, and the more than 20 databases that collect details of every aspect of children's educational lives. Last month, one of these - the Learning Records Service - was caught granting betting companies access to personal data about 28 million children. DefendDigital.me has called for an Educational Rights Act. This idea could be usefully expanded to include children's online rights more broadly.


Illustrations: Time magazine's 1995 "Cyberporn" cover, which marked the first children-Internet panic.

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.

February 6, 2020

Mission creep

"We can't find the needles unless we collect the whole haystack," a character explains in the new play The Haystack, written by Al Blyth and in production at the Hampstead Theatre through March 7. The character is Hannah (Sarah Woodward), the director of a surveillance effort being coded and built by Neil (Oliver Johnstone) and Zef (Enyi Okoronkwo), familiarly geeky types whose preferred day-off activities are the cinema and the pub, rather than catching up on sleep and showers, as Hannah pointedly suggests. Zef has a girlfriend (and a "spank bank" of downloaded images) and is excited to work in "counter-terrorism". Neil is less certain, less socially comfortable, and, we eventually learn, more technically brilliant; he must come to grips with all three characteristics in his quest to save Cora (Rona Morison). Cue Fleabag: "This is a love story."

The play is framed by an encrypted chat between Neil and Denise, Cora's editor at the Guardian (Lucy Black). We know immediately from the technological checklist they run down in making contact that there has been a catastrophe, which we soon realize surrounds Cora. Even though we're unsure what it is, it's clear Neil is carrying a load of guilt, which the play explains in flashbacks.

As the action begins, Neil and Zef are waiting to start work as a task force seconded to Hannah's department to identify the source of a series of Ministry of Defence leaks that have led to press stories. She is unimpressed with their youth, attire, and casual attitude - they type madly while she issues instructions they've already read - but changes abruptly when they find the primary leaker in seconds. Two stories remain; because both bear Cora's byline she becomes their new target. Both like the look of her, but Neil is particularly smitten, and when a crisis overtakes her, he breaks every rule in the agency's book by grabbing a train to London, where, calling himself "Tom Flowers", he befriends her in a bar.

Neil's surveillance-informed "god mode" choices of Cora's favorite music, drinks, and food when he meets her recall the movie Groundhog Day, in which Phil (Bill Murray) slowly builds up, day by day, the perfect approach to the woman he hopes to seduce. In another cultural echo, the tense beginning is sufficiently reminiscent of the opening of Laura Poitras's film about Edward Snowden, CitizenFour, that I assumed Neil was calling from Moscow.

The requirement for the haystack, Hannah explains at the beginning of Act Two, is because the terrorist threat has changed from organized groups to home-grown "lone wolves", and threats can come from anywhere. Her department must know *everything* if it is to keep the nation safe. The lone-wolf theory is the one surveillance justification Blyth's characters don't chew over in the course of the play; for an evidence-based view, consult the VOX-Pol project. In a favorite moment, Neil and Hannah demonstrate the frustrating disconnect between technical reality and government targets. Neil correctly explains that terrorists are so rare that, given the UK's 66 million population, no matter how much you "improve" the system's detection rate it will still be swamped by false positives. Hannah, however, discovers he has nonetheless delivered. The false positive rate is 30% less! Her bosses are thrilled! Neil reacts like Alicia Florrick in The Good Wife after one of her morally uncomfortable wins.
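Neil's point is the classic base-rate problem: when the thing you are screening for is vanishingly rare, even a highly "accurate" detector flags mostly innocents. A back-of-the-envelope sketch (all rates here are illustrative assumptions, not figures from the play) shows why:

```python
# Base-rate arithmetic: why rare targets swamp even an accurate screen.
# All numbers below are illustrative assumptions, not figures from the play.

population = 66_000_000        # UK population, as cited in the play
targets = 100                  # assumed number of actual targets
true_positive_rate = 0.99      # assumed: the screen flags 99% of real targets
false_positive_rate = 0.001    # assumed: it also flags 0.1% of innocent people

true_hits = targets * true_positive_rate
false_hits = (population - targets) * false_positive_rate

# Precision: of everyone flagged, what fraction is actually a target?
precision = true_hits / (true_hits + false_hits)

print(f"innocent people flagged: {false_hits:,.0f}")
print(f"chance a flagged person is a real target: {precision:.2%}")
```

With these assumed numbers the screen flags roughly 66,000 innocent people to catch about 99 real targets, so a flagged person has well under a 1% chance of being genuine. Cutting the false positive rate by 30%, as Neil does, still leaves tens of thousands of false alarms, which is exactly the disconnect the scene dramatizes.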

Related: it is one of the great pleasures of The Haystack that its three female characters (out of a total of five) are smart, tough, self-reliant, ambitious, and good at their jobs.

The Haystack is impressively directed by Roxana Silbert. It isn't easy to make typing look interesting, but this play manages it, partly by the well-designed use of projections to show both the internal and external worlds they're seeing, and partly by carefully-staged quick cuts. In one section, cinema-style cross-cutting creates a montage that fast-forwards the action through six months of two key relationships.

Technically, The Haystack is impressive; Zef and Neil speak fluent Python, algorithms, and Bash scripts, and laugh realistically over a journalist's use of Hotmail and Word with no encryption ("I swear my dad has better infosec"), while the projections of their screens are plausible pieces of code, video games, media snippets, and maps. The production designers and Blyth, who has a degree in econometrics and a background as a research economist, have done well. There were just a few tiny nitpicks: Neil can't trace Cora's shut-down devices "without the passwords" (huh?); and although Neil and Zef also use Tor, at one point they use Firefox (maybe) and Google (doubtful). My companion leaned in: "They wouldn't use that." More startling, for me, the actors who play Neil and Zef pronounce "cache" as "cachet"; but this is the plaint of a sound-sensitive person. And that's it, for the play's 1:50 length (trust me; it flies by).

The result is an extraordinary mix of a well-plotted comic thriller that shows the personal and professional costs of both being watched and being the watcher. What's really remarkable is how many of the touchstone digital rights and policy issues Blyth manages to pack in. If you can, go see it, partly because it's a fine introduction to the debates around surveillance, but mostly because it's great entertainment.


Illustrations: Rona Morison, as Cora, in The Haystack.
