
Old thinking

"New technology is doing very old work," Center for Media Justice executive director Malkia Cyril told Computers, Freedom, and Privacy in 2015, reminding the mostly white, middle-class attendees that the level of surveillance that was newly outraging them has long been a fact of life for African-American and Muslim communities in the US.

Last week, at the Personal Democracy Forum, Matt Mitchell, the founder of Crypto Harlem, made a similar point in discussing menacing cities (start at 1:09:00), his term for what are more often called "smart cities". A particularly apt example was his discussion of Google's plans for Toronto's waterfront. The company's mock-up shows an old woman crossing the street very slowly while the reactive traffic lights delay turning green to give her time to get across. Now, my own reaction is to think that all the kids and self-important people in the neighborhood will rapidly figure out there's a good game in blocking the cars from ever getting through. Mitchell's is to see immediately that the light controls can see people and watch their movements. He is certainly right, and he would be rightly hard to convince that the data will only be used for everyone's good.

Much has already been published about bias in what technology companies call predictive policing and Stop LAPD Spying Coalition leader Hamid Khan called "speculative policing" at that same CFP 2015. Mitchell provided background that, as he said, is invisible to those who don't live near US public housing apartment blocks. I have never seen, as he has, police vans parked outside the houses on my street to shine lights into the windows at night on the basis that darkness fosters crime. "You will never see it if you don't walk through these neighborhoods." It is dangerous to assume it will never happen to you.

"Smart" - see Bruce Sterling's completely correct rant about this terminology - cities would embed all this inside the infrastructure and make the assumptions behind it and its operations largely invisible. The analogy that occurs to mind is those elevators you encounter in "smart buildings" where the buttons to choose your floor are all on the outside. It optimizes life for the elevators, but for a human rider it's unnerving to find no controls except buttons to open the doors or sound an alarm. Basically, agency ends at the elevator door. The depressing likelihood is that this is how smart cities will be built, too. Cue Sterling: "The language of Smart City is always Global Business English, no matter what town you're in."

"Whose security?" Royal Holloway professor Lizzie Coles-Kemp often asks. Like Mitchell, she works with underserved communities - in her case, in Britain, with families separated by prison, and the long-term unemployed. For so many, "security" means a hostile system designed as if they are attackers, not users. There is a welcome trend of academics and activists working along these lines: David Corsar is working with communities near Aberdeen to understand what building trust into the Internet of Things means to them; at Bucknell University in Pennsylvania Darakhshan Mir is leading a project on community participation in privacy decisions. Mitchell himself works to help vulnerable communities protect themselves against surveillance.

Technological change is generally sold to us as advances: more efficient, safer, fairer, and cheaper. Deployment offers the chance to reimagine processes. Yet so often the underpinning thinking goes unexamined. In a tiny example at this week's EEMA conference, a speaker listed attributes banks use to digitally identify customers. Why is gender among them? one woman asked. The speaker replied that the system was GDPR-compliant. Not the point: why use an attribute that for an increasing number of people is fluid? (My own theory is that in the past, formalities meant staff needed gender in order to know how to address you in a letter, and now everyone collects it because they always have.)

Bigger examples have long been provided by David Alexander, the co-founder of Mydex, a community interest company that has been working on personal data stores for the last decade-plus. In behavior Alexander has dubbed "organizational narcissism", services claim to be "user-centric" when they're really "producer-centric". Despite the UK government's embrace of "digital", for example, it still retains the familiar structure in which it is the central repository of data and power, while we fill out their forms to suit their needs. At EEMA, Microsoft's architect of identity, Kim Cameron, was also talking about moving control into our hands. Microsoft (and Cameron) has been working on this for more than 15 years, first as CardSpace (canceled 2011), then as U-Prove (silent since 2014). Another push seems to be imminent, but it's a slow, hard road to upend the entrenched situation. What "disruption" is this if the underlying structures remain the same?

Back to Mitchell, who had just discussed the normalization of whiteness as the default when black people are omitted from machine learning training datasets: "Every time someone tells you about an amazing piece of technology, you have to remember there's an equally amazing horrible thing inside of it and if we don't train ourselves to think this way we're going to end up in a bad future."


Illustrations: "Old woman" crosses the street in Sidewalk Labs' plans for Toronto's waterfront.

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.
