Archives

April 16, 2021

Frenemies

This week, an update to the UK's contact tracing app (which, confusingly, is labeled "NHS" but is actually part of the privately contracted test and trace system) was blocked by Google and Apple because it broke their terms and conditions. What the UK wanted: people who tested positive to upload their collected list of venue check-ins, now that the latest national lockdown is easing. Under Google's and Apple's conditions, to which the government had agreed: banned. Oops.

The previouslies: this time last year, it was being widely suggested that contact tracing apps could save us. In May 2020, the BMJ blog called downloading the app a "moral obligation".

That reaction was part of a battle over privacy. Step One: Western horror at the Chinese Alipay Health Code app that assigned everyone a traffic light code based on their recent movements and contacts and determined which buildings and public places they could enter - the permission-based society at a level that would surely be unacceptable in a Western democracy. Step Two: the UK, like France, designed its own app to collect users' data for centralized analysis, tracking, and tracing. Privacy advocates argued that this design violated data protection law and that public health goals could be met by less invasive means. Technical advisers warned it wouldn't work. Step Three: Google and Apple built a joint "exposure notification" platform to underpin these contact tracing apps and set the terms: no centralized data collection. Data must remain local unless the user opts to upload it. The UK and France grumpily switched when they discovered everyone else was right: their design didn't work. Later, the two companies embedded exposure notification into their operating systems so public health departments didn't have to build their own apps.

Make no mistake: *contact tracing* works. It's a well-established practice in public health emergencies. But we don't know if contact tracing *apps* work, where "work" means "reduce infections" as opposed to "work technically", "are well-designed", or even "dispense with those silly privacy considerations". Most of the claimed success for these apps came shortly after release and measured success in download numbers, on the basis that the apps will only work if enough people use them. The sole exception appears to be Singapore, where claimed download rates approach 60% and authorities report the app has halved the time to complete contact tracing, from four days to two.

In June, Italian biologist Emanuele Rizzo warned in the British Medical Journal that the apps are poorly suited for the particular characteristics of how the coronavirus spreads and the heightened risk for older people, who are least likely to have smartphones. In October, AI researcher Allison Gardner wrote at The Conversation that the worldwide average for downloading these apps was an inadequate 20%.

The UK was slow to get its contact tracing app working, and by the time it did we were locking down for the winter. Even so, last summer most UK venues posted QR codes for visitors to scan to log their visits. If someone who visited a venue later tests positive, that check-in is reported to a database; your phone retrieves the list and alerts you if you were there at the same time, so you can get tested and, if necessary, self-isolate.
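The on-device matching that model implies can be sketched in a few lines. This is only an illustration of the decentralized idea, not the app's actual code: the names and data shapes are hypothetical, and the real system uses opaque QR payloads and cryptographic identifiers rather than plain venue names.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CheckIn:
    venue_id: str   # hypothetical venue identifier (real apps scan an opaque QR payload)
    start: int      # arrival time, minutes since some shared epoch
    end: int        # departure (or assumed departure) time

def exposure_alerts(my_checkins, flagged_checkins):
    """Compare locally stored check-ins against the downloaded list of
    check-ins reported by people who tested positive. The comparison
    happens entirely on the user's device; nothing is uploaded."""
    alerts = set()
    for mine in my_checkins:
        for flagged in flagged_checkins:
            same_venue = mine.venue_id == flagged.venue_id
            overlapped = mine.start < flagged.end and flagged.start < mine.end
            if same_venue and overlapped:
                alerts.add(mine.venue_id)
    return sorted(alerts)

mine = [CheckIn("cafe", 100, 160), CheckIn("pub", 200, 260)]
flagged = [CheckIn("cafe", 150, 210), CheckIn("gym", 0, 50)]
print(exposure_alerts(mine, flagged))  # ['cafe']
```

The point of the design is visible in the data flow: the database only ever holds check-ins that infected users chose to upload, and the matching against your own history never leaves your phone.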

Of course, for the last five months nothing's been open. Check-ins and contact tracing apps aren't much use when no one is going anywhere. But during the period when people tried this out, there were many reported problems, such as the app deciding exposure had taken place when you and the infected person overlapped only briefly. It remains simpler, probably cheaper overall, and more future-proof to improve ventilation and make venues safer.

Google's and Apple's action means, I suppose, that I am supposed to be grateful, however grumpily, to Big Tech for protecting me against government intrusion. What I want, though, is to be able to trust the health authorities, so that this sort of issue only arises when absolutely necessary. Depending on the vagaries of private companies' business models to protect us is not a solution.

This is a time when many are not happy with either company. Google's latest wheeze is to replace third-party cookies with Federated Learning of Cohorts (FLoC), which assigns Chrome users to categories it then uses to target ads. EFF has a new tool that shows whether you've been "FLoCed" (Firefox users need not apply). Google calls this setup a "privacy sandbox" and claims it will be more privacy-protective than the present all-tracking, by-everyone, all-the-time situation. EFF calls this "old tracking" versus "new tracking" and argues for a third option: *not* tracking, and letting users decide what information to share and with whom.

Apple, meanwhile, began blocking tracking via third-party cookies last year, with dramatic results, and rejects apps that aren't compliant, though some companies are finding workarounds. This year, new Apple rules requiring privacy labels that identify the categories of data apps collect have exposed the extent of data collection via Google's Chrome browser and search app.

The lesson to be drawn here is not that these companies are reinventing themselves as privacy protectors. The lesson is that each wants to be the *only* one to invade our privacy. It is only a coincidence that, this time, the result was a refusal to accommodate government demands.


Illustrations: Empty central London in lockdown in November 2020.

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.