"We will make that person job-ready," Mary Keeling, IBM's manager for economic analysis and smarter cities, promised at last week's Westminster eForum seminar on smart cities, speaking of what could be done with the right big data analytics. I might - only might - have let that pass without notice if I hadn't been, not long ago, at the Cybersalon event on the same topic. There, Usman Haque pointed out the work-ethic-oriented rhetoric surrounding IBM's marketing of its "smarter planet". I've heard those ads for years, but noticed the music more than that underlying message.
Haque went on to comment that the technology companies selling smart cities tend to see them as a problem to be solved.
"What makes a city valuable is the unpredictability, diversity, and heterogeneity, all the stuff you don't expect and the people you disagree with," he said. "But that's not what the technology companies are selling you. They are providing systems for convenience, predictability, optimization, security. And that's how they're selling the technology." (As Don Draper might have said, "What's the happiness of solving a problem? The moment before you need more happiness and a new solution.") Haque again: "The same language was used in the 1950s and 1960s to sell highways and high-rises - ways that cities will become more efficient. Now we understand that highways have had untold consequences...and high-rises have had to be knocked down."
Frank Kresin, research director for Amsterdam's Waag Society, similarly said, "Smart cities are not about technology but should be used to empower people. What is smart about a city is the smart people who live there."
Hey, that's *us*!!
So I was newly sensitized by those comments when I heard "job-ready", which, no matter how much the person concerned wants work, doesn't sound like something you do to a person - it sounds like something you do to a machine tool. If smart cities are to be the vibrant, attractive places that proponents suggest, they will have to be more human than that. Which is what Haque was arguing for: the Engaged City, where people are enabled to collaborate on shared challenges rather than having their planner-specified problems "solved" by remote black-box algorithms and technologies. The comic version of Haque's concern might be the transport planner in *Twenty Twelve*, who was convinced he could unsnarl London's traffic if only his policies were applied to a large enough area. The serious version is the project Haque cited that aimed to replace traffic lights with interactive signs on the road. Do you want people staring at the road, or looking up and around them for oncoming traffic?
Anyone who's been around information technology systems for any length of time knows that their success depends on their being embraced by their users - and that users will only embrace them if they have been engaged in the development process and the systems offer them genuine benefits. Take Oyster cards as an example: people did not care whether they had a smart card or a paper ticket as long as they could get on a bus or tube quickly. London Underground added a bit of encouragement by making Oyster fares cheaper than the cash equivalent, and people quickly found it convenient not to have to buy a new ticket for every journey. In this time of steady reports of data breaches, the incremental change of dropping Oyster cards in favor of paying directly with bank cards may not be as widely accepted among locals - but will likely be loved by visitors.
Vendors selling technology systems to government agencies and departments typically see the staff of those agencies and departments as the users of those systems. To some extent that's correct: they are the people who have to run the software every day and live with its bugs and quirks. So they are an important group to consider. However, the *actual* users of these systems are all of us: the people who must tangle with DWP to get state benefits and pensions; the people whose children are experimented upon with each new education policy; the residents who do not get asked which hours they need parking to be restricted.
The effect of this way of thinking will be seen especially in its impact on individual privacy, which was not discussed at the Westminster eForum at all except in the context of compliance with data protection law. The forerunners of tomorrow's smart cities are systems like Oyster and today's network of ANPR cameras, which began with London's congestion charge; all of these could have been designed with privacy and anonymity in mind, but in general they are instead maximally invasive. In this context, it's worth recalling that many measurement systems of the past - IQ testing, for example - were intended to identify people who needed help but have morphed into systems used to rank everyone, a purpose for which these metrics were never designed. In today's accelerated world, technologies and companies come and go very quickly, and jobs appear and disappear in response. Big data analytics are only as good as the data you feed them; today's "job-ready" person may be tomorrow's between-stairs maid.
Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.