Systemic infection
"Can you keep a record of every key someone enters?"
This question brought author and essayist Ellen Ullman up short when it was posed to her circa 1996, while she was still working as a software engineer. "Yes, there are ways to do that," she replied after a stunned pause.
In her 1997 book Close to the Machine, Ullman describes the incident as "the first time I saw a system infect its owner". After a little gentle probing, her questioner, the owner of a small insurance agency, explained that now that he had installed a new computer system he could find out what his assistant, who had worked for him for 26 years and had picked up his children from school when they were small, did all day. "The way I look at it," he explained, "I've just spent all this money on a system, and now I get to use it the way I'd like to."
Ullman appears to have dissuaded this particular business owner on this particular occasion, but she goes on to observe that over the years she saw the same pattern repeated many times. Sooner or later, someone always realizes that the systems they have commissioned for benign purposes can be turned to checking up on people and finding out things they couldn't know before. "There is something...in the formal logic of programs and data, that recreates the world in its own image," she concludes.
I was reminded of this recently when I saw a report at The Register that the US state of New Jersey, along with two dozen others, may soon require any contractor working on a government contract worth more than $100,000 to install keylogging software to verify that they're actually working all the hours - one imagines that eventually it will be minutes - they bill for. Veteran reporter Thomas Claburn notes that the text of the bill was provided by TransparentBusiness, a maker of remote work management software - itself a growing trend.
Speaking as a taxpayer, I can see the point of ensuring that governments get full value for our money. But speaking as a freelance writer who has occasionally had to work on projects where I'm paid by the hour or day (a situation I've always tried to avoid by agreeing a rate for the whole job), I find the distrust inherent in such a system poisonous. Why are we hiring people we can't trust? Most of us who have taken on the risks of self-employment have done so in part because of the autonomy and the certain freedom from bosses it offers. And now we're talking about the kind of intensive monitoring that in the past has been reserved for full-time employees - who have never much liked it either.
One sector already fighting its way through this kind of transition is trucking. In 2014, Cornell sociologist Karen Levy published the results of three years of research into the arrival of electronic monitoring in truckers' cabs as a response to safety concerns. For truckers, whose cabs are literally their part-time homes, electronic monitoring is highly intrusive; effectively, the trucking company is installing a camera and other sensors not just in their office but in their living room and bedroom as well. Levy finds that the necessity of making a living at low per-mile rates pushes truckers to squeeze the unavoidable hours of unpaid work - waiting for loading and unloading, for example - into their statutory hours of "rest". Instead of using electronics to try to change unsafe practices, she argues, alter the economic incentives that produce them.
The result sounds like it would be familiar to Uber drivers or modern warehouse workers, even if Amazon never deploys the wristbands it patented in 2016. In an interview published this week, Data & Society researcher Alex Rosenblat outlines the results of a four-year study of ride-hail drivers across the US and Canada. Forget the rhetoric that these drivers are entrepreneurs, she says; they have a boss, and it's the company's algorithm, which dictates their on-the-job behavior and withholds the data they need to make informed decisions.
If we do nothing, this may be the future of all work. In a discussion last week, University of Leicester associate professor Phoebe Moore located "quantified work" at the intersection of two trends: first, the health-oriented quantified-self movement, and second, the succeeding waves of workplace management from industrialization through time-and-motion studies and scientific management to today's organizational culture, in which, as Moore put it, we're supposed to "love our jobs and identify with our employer". The first of these trends has led to "wellness" programs that, particularly in the US, have helped grant employers access to vastly more detailed personal data about their employees than has ever been available to them before.
Quantification, the combination of these two trends, Moore warns at Medium, will alter the workplace's social values by tending to pit workers against each other, racetrack-style. Vendors now claim predictive power for AI: which prospective employees fit which jobs, or when staff may be about to quit or take sick leave. One can easily imagine, as Moore does, that despite the improvements AI can bring, the AI-quantified workplace will be intensely worker-hostile. The infection continues to spread.
Illustrations: HAL, from 2001: A Space Odyssey (1968).
Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.