The threat we left behind
Be careful what systems you build with good intentions. The next owner may not be so kind.
It has long been a basic principle among privacy activists that a significant danger in embedding surveillance technologies is regime change: today's government is benign, but tomorrow's may not be, so let's not build the technologies that could support a police state for that hostile government to wield. Equally - although it's often politic not to say this explicitly - the owner may remain the same but their own intentions may change as the affordances of the system give them new ideas about what it's possible for them to know.
I would be hard-pressed to produce evidence of a direct connection, but one of the ideas floating around Virtual Diplomacy, a 1997 conference that brought together the Internet and diplomacy communities, was that systems that are privacy-invasive in Western contexts could save lives and avert disasters on the ground in crisis situations. Not long afterwards, biometric identification and other technologies were being built into refugee systems in the US and EU.
In a 2018 article for The New Humanitarian, Paul Currion observes that the systems' development was "driven by the interests of national governments, technology companies, and aid agencies - in that order". Refugees quoted in the article express trust in the UN, but little understanding of the risks of compliance.
Currion dates the earliest use of "humanitarian biometrics" to 2003 - and identifies the location of that groundbreaking use as... Afghanistan, which used iris scanning to verify the identities of Afghans returning from Pakistan in order to prevent fraud. In 2006, the now just-departed president Ashraf Ghani wrote a book pinpointing biometric identification as the foundation of Afghanistan's social policy. Afghanistan, the article concludes, is "the most biometrically identifiable country in the world" - and, it adds, "although UNHCR and the Afghan government have both invested heavily in biometric databases, the US military has been the real driving force." It bases the latter claim on a 2014 article in Public Intelligence analyzing US military documents on the use of biometrics in Afghanistan.
These are the systems that now belong to the Taliban.
Privacy International began warning of the issues surrounding privacy and refugees in the mid-2000s. In 2011, by which time it had been working with UNHCR to improve its practices for four years, PI noted how little understanding there was among funders and the public of why privacy mattered to refugees.
Perhaps it's the word: "privacy" sounds like a luxury, a nice-to-have rather than a necessity, and anyway, how can people held in camps, waiting to be moved on to their next location, care about privacy when what they need is safety, food, shelter, and a reunion with the rest of their families? PI's answer: "Putting it bluntly, getting privacy wrong will get people arrested, imprisoned, tortured, and may sometimes lead to death." Refugees are at risk from both the countries they're fleeing *from* and the countries they're fleeing *to*, which may welcome and support them - or reject, return, deport, or imprison them, or hold them in bureaucratic purgatory. (As I type this, HIAS president and CEO Mark Hetfield is telling MSNBC that the US's 14-step checking process is stopping Afghan-Americans from getting their families out.)
As PI goes on to explain, there is no such thing as "meaningful consent" in these circumstances. At The New Humanitarian, in a June 2021 article, Zara Rahman agrees. She was responding to a Human Rights Watch report that the United Nations High Commissioner for Refugees had handed a detailed biometric database covering hundreds of thousands of Rohingya refugees to the Myanmar government from which they had fled. HRW accused the agency of breaking its own rules for collecting and protecting data, and of failing to obtain informed consent; UNHCR denies the charge. But you're desperate and in danger, and UNHCR wants your fingerprint. Can you really say no?
In many countries UNHCR is the organization that determines refugee status. Personal information is critical to this process. The amount of information collected has increased in some areas to include biometrics; as early as 2008 the US was considering using genetic information to confirm family relationships. More importantly, UNHCR is not always in control of the information it collects. In 2013, PI published a detailed analysis of refugee data collection in Syria. Last week, it published an even more detailed explanation of the systems built in Afghanistan over the last 20 years that have now been left behind.
Shortly after the current crisis began, April Glaser and Sephora Smith reported at NBC News that Afghans were hastily deleting photographs and documents on their phones that might link them to Westerners, international human rights groups, the Afghan military, or the recently departed Afghan government. It's an imperfect strategy: instructions in local Afghan languages on how to do this are not always available, and much of the data, along with the graph of their social connections, is stored on social media services that don't necessarily facilitate mass deletions. Facebook has released tools to help, including a one-click profile-locking button and pop-up instructions on Instagram. Access Now also offers help and is urging international actors to shut down access to these databases before leaving.
This aspect of the Afghan crisis was entirely avoidable.
Illustrations: Afghan woman being iris-scanned for entry into the Korean hospital at Bagram Airfield, Afghanistan, 2012 (via Wikimedia).
Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.