" /> net.wars: February 2019 Archives


February 28, 2019

Systemic infection

"Can you keep a record of every key someone enters?"

This question brought author and essayist Ellen Ullman up short when she was still working as a software engineer and it was posed to her circa 1996. "Yes, there are ways to do that," she replied after a stunned pause.

In her 1997 book Close to the Machine, Ullman describes the incident as "the first time I saw a system infect its owner". After a little gentle probing, her questioner, the owner of a small insurance agency, explained that now that he had installed a new computer system he could find out what his assistant, who had worked for him for 26 years and had picked up his children from school when they were small, did all day. "The way I look at it," he explained, "I've just spent all this money on a system, and now I get to use it the way I'd like to."

Ullman appeared to have dissuaded this particular business owner on this particular occasion, but she went on to observe that over the years she saw the same pattern repeated many times. Sooner or later, someone always realizes that the systems they have commissioned for benign purposes can be turned to making checks and finding out things they couldn't know before. "There is something...in the formal logic of programs and data, that recreates the world in its own image," she concludes.

I was reminded of this recently when I saw a report at The Register that the US state of New Jersey, along with two dozen others, may soon require any contractor working on a contract worth more than $100,000 to install keylogging software to ensure that they're actually working all the hours - one imagines that eventually, it will be minutes - they bill for. Veteran reporter Thomas Claburn goes on to note that the text of the bill was provided by TransparentBusiness, a maker of remote work management software, itself a trend.

Speaking as a taxpayer, I can see the point of ensuring that governments are getting full value for our money. But speaking as a freelance writer who occasionally has had to work on projects where I'm paid by the hour or day (a situation I've always tried to avoid by agreeing a rate for the whole job), the distrust inherent in such a system seems poisonous. Why are we hiring people we can't trust? Most of us who have taken on the risks of self-employment do so because one of the benefits is autonomy and a certain freedom from bosses. And now we're talking about the kind of intensive monitoring that in the past has been reserved for full-time employees - and that none of them have liked much either.

One of the first sectors that is already fighting its way through this kind of transition is trucking. In 2014, Cornell sociologist Karen Levy published the results of three years of research into the arrival of electronic monitoring in truckers' cabs as a response to safety concerns. For truckers, whose cabs are literally their part-time homes, electronic monitoring is highly intrusive; effectively, the trucking company is installing a camera and other sensors not just in their office but also in their living room and bedroom. Levy finds that the necessity of making a living at low per-mile rates pushes truckers to squeeze the unavoidable hours of unpaid work - waiting for loading and unloading, for example - into their statutory hours of "rest". Instead of using electronics to try to change unsafe practices, she argues, alter those economic incentives.

The result sounds like it would be familiar to Uber drivers or modern warehouse workers, even if Amazon never deploys the wristbands it patented in 2016. In an interview published this week, Data & Society Institute researcher Alex Rosenblat outlines the results of a four-year study of ride-hail drivers across the US and Canada. Forget the rhetoric that these drivers are entrepreneurs, she writes; they have a boss, and it's the company's algorithm, which dictates their on-the-job behavior and withholds the data they need to make informed decisions.

If we do nothing, this may be the future of all work. In a discussion last week, University of Leicester associate professor Phoebe Moore located "quantified work" at the intersection of two trends: first, the health-oriented quantified-self movement, and second, the succeeding waves of workplace management from industrialization through time and motion study, scientific management, and today's organizational culture, where, as Moore put it, we're supposed to "love our jobs and identify with our employer". The first of these has led to "wellness" programs that, particularly in the US, have granted employers access to vastly more detailed personal data about their employees than has ever been available to them before.

Quantification, the combination of the two trends, Moore warns at Medium, will alter the workplace's social values by tending to pit workers against each other, racetrack-style. Vendors now claim predictive power for AI: which prospective employees fit which jobs, or when staff may be about to quit or take sick leave. One can, as Moore does, easily imagine that, despite the improvements AI can bring, the AI-quantified workplace will be intensively worker-hostile. The infection continues to spread.


Illustrations: HAL, from 2001: A Space Odyssey (1968).

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.

February 22, 2019

Metropolis

Metropolis-openingshot.png"As a citizen, how will I know I live in a smarter city, and how will life be different?" This question was probably the smartest question asked at yesterday's Westminster Forum seminar on smart cities (PDF); it was asked by Tony Sceales, acting as moderator.

"If I feel safe and there's less disruption," said Peter van Manen. "You won't necessarily know. Thins will happen as they should. You won't wake up and say, 'I'm in the city of the future'," said Sam Ibbott. "Services become more personalized but less visible," said Theo Blackwell the Chief Digital Office for London.

"Frictionless" said Jacqui Taylor, offering it as the one common factor she sees in the wildly different smart city projects she has encountered. I am dubious that this can ever be achieved: one person's frictionless is another's desperate frustration: streets cannot be frictionless for *both* cars and cyclists, just as a city that is predicted to add 2 million people over the next ten years can't simultaneously eliminate congestion. "Working as intended" was also heard. Isn't that what we all wish computers would do?

Blackwell had earlier mentioned the "legacy" of contactless payments for public transport. To Londoners smushed into stuffed Victoria Line carriages in rush hour, the city seems no smarter than it ever was. No amount of technological intelligence can change the fact that millions of people all want to go home at the same time or the housing prices that force them to travel away from the center to do so. We do get through the ticket barriers faster.

"It's just another set of tools," said Jennifer Schooling. "It should feel no different."

The notion of not knowing as the city you live in smartens up should sound alarm bells. The less comfortable reason for that hiddenness is the reality that, as Sara Degli Esposti pointed out at this year's Computers, Privacy, and Data Protection (CPDP), this whole area is a business-to-business market. "People forget that, especially at the European level. Users are not part of the picture, and that's why we don't see citizens engaged in smart city projects. Citizens are not the market. This isn't social media."

She was speaking at CPDP's panel on smart cities and governance, convened by the University of Stirling's William Webster, who has been leading a research project, CRISP, to study these technologies. CRISP asked a helpfully different question: how can we use smart city technologies to foster citizen engagement, coproduction of services, development of urban infrastructure, and governance structures?

The interesting connection is this: it's no surprise when CPDP's activists, regulators, and academics talk about citizen engagement and participation, or deplore a model in which smart cities are a business-led excuse for corporate and government surveillance. The surprise comes when, two weeks later, the same themes arise among Westminster Forum's private and public sector speakers and audience. These are the people who are going to build these new programs and services, and they, too, are saying they're less interested in technology and more interested in solving the problems that keep citizens awake at night: health, especially.

A paradigm shift appears to be under way as municipalities consider seriously where and on what to spend their funds.

However, the shift may be solely European. At CPDP, Canadian surveillance studies researcher David Murakami Wood told the story of Toronto, where (Google owner) Alphabet subsidiary Sidewalk Labs swooped in circa 2017 with proposals to redevelop the Quayside area of Toronto in partnership with Waterfront Toronto. The project has been hugely controversial - there were hearings this week in Ottawa, the national capital.

As Murakami Wood tells it, for Sidewalk Labs the area is a real-world experiment using real people's lives as input to create products the company can later sell elsewhere. The company has made clear it intends to keep all the data the infrastructure generates on its servers in the US, as well as all the intellectual property rights. This, Murakami Wood argued, is the real cost of the "free" infrastructure. It is also, as we're beginning to see elsewhere, the extension of online tracking - or, as Murakami Wood put it, surveillance capitalism - into the physical world: cultural appropriation at municipal scale by a company that has no track record of constructing buildings, or even of publishing detailed development plans. Small wonder that Murakami Wood laughed when he heard Sidewalk Labs CEO Dan Doctoroff impress a group of enthusiastic young Canadian bankers with the news that the company had been studying cities for *two years*.

Putting these things together, we have, as Andrew Adams suggested, three paradigms, which we might call US corporate, Chinese authoritarian, and, emerging, European participatory and cooperative. Is this the choice?

Yes and no. Companies obviously want to develop systems once, sell them everywhere. Yet the biggest markets are one-off outliers. "Croydon," said Blackwell, "is the size of New Orleans." In addition, approaches vary widely. Some places - Webster mentioned Glasgow - are centralized command and control; others - Brazil - are more bottom-up. Rick Robinson finds that these do not meet in the middle.

The clear overall takeaway is that local context is crucial in shaping smart city projects, and despite some common factors each one is different. We should build on that.


Illustrations: Fritz Lang's Metropolis (1927).

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.

February 14, 2019

Copywrong

Just a couple of weeks ago it looked like the EU's proposed reform of the Copyright Directive, last updated in 2001, was going to run out of time. In the last three days, it's revived, and it's heading straight for us. As Joe McNamee, the outgoing director of European Digital Rights (EDRi), said last year, the EU seems bent on regulating Facebook and Google by creating an Internet in which *only* Facebook and Google can operate.

We'll start with copyright. As previously noted, the EU's proposed reforms include two particularly contentious clauses: Article 11, the "link tax", which would require anyone using more than one or two words to link to a news article elsewhere to get a license, and Article 13, the "upload filter", which requires any site more than three years old *or* earning more than €10,000,000 a year in revenue to ensure that no user posts anything that violates copyright; sites that allow user-generated content must make "best efforts" to buy licenses for anything users might post. So even a tiny site - like net.wars, which is 13 years old - that hosted comments would logically be required to license all copyrighted content in the known universe, just in case. In reviewing the situation at TechDirt, Mike Masnick writes, "If this becomes law, I'm not sure Techdirt can continue publishing in the EU." Article 13, he continues, makes hosting comments impossible, and Article 11 makes their own posts untenable. What's left?
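It can help to spell that exemption out as logic; the *or* is what catches nearly everyone. A toy sketch in Python, based purely on the description above (the directive's actual text adds further conditions, such as audience size):

```python
# A toy reading of the Article 13 thresholds as described above: a site
# escapes the full filtering obligation only if it is BOTH under three
# years old AND under 10 million euros in annual revenue. A sketch of
# the text's description, not legal advice.
def must_filter(age_years: float, revenue_eur: float) -> bool:
    return age_years > 3 or revenue_eur > 10_000_000

# net.wars, 13 years old and earning roughly nothing, still qualifies:
print(must_filter(age_years=13, revenue_eur=0))   # True
```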

To these known evils, the German Pirate Party MEP Julia Reda finds that the final text adds two more: limitations on text and data mining that allow rights holders to opt out under most circumstances, and - wouldn't you know it? - the removal of provisions that would have granted authors the right to proportionate remuneration (that is, royalties) instead of continuing to allow all-rights buy-out contracts. Many younger writers, particularly in journalism, now have no idea that as recently as 1990 limited contracts were the norm; the ability to resell and exploit their own past work was one reason the writers of the mid-20th century made much better livings than their counterparts do now. Communia, an association of digital rights organizations, writes that at least this final text can't get any *worse*.

Well, I can hear Brexiteers cry, what do you care? We'll be out soon. No, we won't - at least, we won't be out from under the Copyright Directive. For one thing, the final plenary vote is expected in March or April - before the May European Parliament elections. The good side of this is that UK MEPs will have a vote, and can be lobbied to use that vote wisely; from all accounts the present agreed final text settled differences between France and Germany, against which the UK could provide some balance. The bad side is that the UK, which relies heavily on exports of intellectual property, has rarely shown any signs of favoring either Internet users or creators against the demands of rights holders. The ugly side is that presuming this thing is passed before the UK brexits - assuming that happens - it will be the law of the land until or unless the British Parliament can be persuaded to amend it. And the direction of travel in copyright law for the last 50 years has very much been toward "harmonization".

Plus, the UK never seems to be satisfied with the amount of material its various systems are blocking, as the Open Rights Group documented this week. If the blocks in place weren't enough, Rebecca Hill writes at the Register that under the just-passed Counter-Terrorism and Border Security Act, clicking on a link to information likely to be useful to a person committing or preparing an act of terrorism is an offense. It seems to me that could be almost anything - automotive listings on eBay, chemistry textbooks, a *dictionary*.

What's infuriating about the copyright situation in particular is that no one appears to be asking the question that really matters, which is: what is the problem we're trying to solve? If the problem is how the news media will survive, this week's Cairncross Review, intended to study that exact problem, makes some suggestions. Like them or loathe them, they involve oversight and funding; none involve changing copyright law or closing down the Internet.

Similarly, if the problem is market dominance, try anti-competition law. If the problem is the increasing difficulty of making a living as an author or creator, improve their rights under contract law - the very provisions that Reda notes have been removed. And, finally, if the problem is the future of democracy in a world where two companies are responsible for poisoning politics, then delving into campaign finances, voter rights, and systemic social inequality pays dividends. None of the many problems we have with Facebook and Google are actually issues that tightening copyright law solves - nor is their role in spreading anti-science, such as, just in from Twitter, anti-vaccination ads targeted at pregnant women.

All of those are problems we really do need to work on. Instead, the only problem copyright reform appears to be trying to solve is, "How can we make rights holders happier?" That may be *a* problem, but it's far from the one most worth solving.


Illustrations: Anti-copyright symbol (via Wikimedia); Julia Reda MEP in 2016.

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.

February 8, 2019

Doing without

Over at Gizmodo, Kashmir Hill has conducted a fascinating experiment: cutting out, in turn, Amazon, Facebook, Google, Microsoft, and Apple, culminating with a week without all of them. Unlike the many fatuous articles in which privileged folks boast about disconnecting, Hill is investigating a serious question: how deeply have these companies penetrated into our lives? As we'll see, this question encompasses the entire modern world.

For that reason, it's important. Besides, as Hill writes, it's wrong to answer objections to GAFAM's business practices - or their privacy policies - with, "Well, don't use them, then." It may be possible to buy from smaller sites and local suppliers, delete Facebook, run Linux, switch to AskJeeves and OpenStreetMap, and dump the iPhone, but doing so requires a substantial rethink of many tasks. As regulators consider curbing GAFAM's power, Hill's experiment shows where to direct our attention.

Online, Amazon is the hardest to avoid. As Lina M. Khan documented last year, Amazon underpins an ever-increasing amount of Internet infrastructure. Netflix, Signal, the WELL, and Gizmodo itself all run on top of Amazon's cloud services, AWS. To ensure she blocked all of them, Hill got a technical expert to set up a VPN that blocked all IP addresses owned by each company and monitored attempted connections. Even that, however, was complicated by the use of content delivery networks, which mask the origin of network traffic.
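I don't know how Hill's expert actually built the blocker, but the general approach is reproducible, because Amazon publishes its address ranges at a well-known URL. A minimal sketch in Python, which prints firewall commands rather than running them, so you can inspect what you're about to cut yourself off from:

```python
# Sketch of the blocking approach described above: fetch the IP ranges
# Amazon publishes and emit ipset/iptables rules for them. The URL is
# real; the firewall commands are one common way to consume the list,
# not necessarily what Hill's expert set up.
import json
import urllib.request

AWS_RANGES = "https://ip-ranges.amazonaws.com/ip-ranges.json"

with urllib.request.urlopen(AWS_RANGES) as resp:
    data = json.load(resp)

# Every IPv4 prefix Amazon claims, regardless of service or region.
prefixes = sorted({p["ip_prefix"] for p in data["prefixes"]})

print("ipset create aws-block hash:net")
for prefix in prefixes:
    print(f"ipset add aws-block {prefix}")
print("iptables -I OUTPUT -m set --match-set aws-block dst -j REJECT")
```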

Barring Facebook also means dumping Instagram and WhatsApp, and, as Hill notes, changing the sign-in procedure for any website where you've used your Facebook ID. Even if you are a privacy-conscious net.wars reader who would never grant Facebook that pole position, the social media buttons on most websites and the ubiquitous trackers also have to go.

For Hill, blocking Apple - which seems easy to us non-Apple users - was "devastating". But this is largely a matter of habit, and habits can be re-educated. The killer was the apps: because iMessage reroutes texts to its own system, some of Hill's correspondents' replies never arrived, and she couldn't FaceTime her friends. Her conclusion: "It's harder to get out of Apple's ecosystem than Google's." However, once out she found it easy to stay that way - as long as she could resist her friends pulling her back in.

Google proved easier than expected despite her dependence on its services - Maps, calendar, browser. Here the big problem was email. The amount of stored information made it impossible simply to move and delete the account; now we know why Google provides so much "free" storage space. As with Amazon, the bigger issue was all the services Google underpins - trackers, analytics, and, especially, Maps, on which Uber, Lyft, and Yelp all depend. Hill should be grateful she didn't have a Nest thermostat and doesn't live in Minnesota. The most surprising bit is that so many sites load Google *fonts*. Also, like Facebook, Google has spread logins across the web, and Hill had to find an alternative to Dropbox, which uses Google to verify users.
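You can get a rough sense of any page's dependencies with a few lines of script. A sketch using only the Python standard library; the list of telltale domains is illustrative, not exhaustive:

```python
# List the third-party hosts a page references, flagging ones that look
# like big-five infrastructure. Crude: a regex over src/href attributes,
# not a full crawl of scripts and stylesheets.
import re
import urllib.request

BIG_FIVE_HINTS = ("googleapis.com", "gstatic.com", "google-analytics.com",
                  "connect.facebook.net", "amazonaws.com",
                  "microsoft.com", "apple.com")

def referenced_hosts(url):
    with urllib.request.urlopen(url) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    return set(re.findall(r'(?:src|href)=["\']https?://([^/"\']+)', html))

if __name__ == "__main__":
    for host in sorted(referenced_hosts("https://example.com/")):
        flag = "  <- big five?" if host.endswith(BIG_FIVE_HINTS) else ""
        print(host + flag)
```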

In our minds, Microsoft is like Apple. Don't like Windows? Get a Mac or use Linux. Ah, but: I have seen the Windows Blue Screen of Death on scheduling systems on both the London Underground and Philadelphia's SEPTA. How many businesses that I interact with depend on Microsoft products? PCs, Office, and Windows servers and point of sale systems are everywhere. A VPN can block LinkedIn, Skype, and (sadly) Github - but it can't block any of those offline systems, or the back office at your bank. You can sell your Xbox, but even the local film society shows movies using VLC on Windows.

Hill's final episode, in which she eliminates all five simultaneously, posted just last night. As expected, she struggles to find alternative ways to accomplish many tasks she hasn't had to think about before. Ironically, this is easier if you're an Old Net Curmudgeon: as soon as she says large file, can't email, I go, "FTP!", while various web services all turn out to be hosted on AWS, and she eventually lands on "command line". It's a definite advantage if you remember how you did stuff *before* the Internet - you can pay the babysitter in cash (or write a check!), and old laptops can be repurposed to run Linux. Even so, complete avoidance is really only realistic for a US Congressman. The hardest for me personally would be giving up my constant companion, DuckDuckGo, which is hosted on...AWS.
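For the record, the curmudgeonly reflex still works with nothing but the standard library. A minimal sketch, with a hypothetical server and placeholder credentials:

```python
# Moving a large file without any of the big five: plain FTP from
# Python's standard library. Server, login, and filename are all
# placeholders, not a real service.
from ftplib import FTP

with FTP("ftp.example.org") as ftp:            # hypothetical server
    ftp.login(user="wendyg", passwd="...")     # placeholder credentials
    with open("big-file.tar.gz", "rb") as f:
        ftp.storbinary("STOR big-file.tar.gz", f)
```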

Several things need to happen to change this - and we *should* change it, because otherwise we're letting them pwn us, as in Dave Eggers' The Circle. The first is making the tradeoffs visible, so that we understand who we're really benefiting and harming with our clicks. The second is regulatory: Lina Khan described in 2017 how to rethink antitrust law to curb Amazon. Facebook, as Marc Rotenberg told CNBC last week, should be required to divest Instagram and WhatsApp. Both Facebook and Google should spin off their identity verification and web-wide login systems into separate companies, or discontinue them. Third, we should encourage alternatives by using them.

But the last thing is the hardest: we must convince all our friends that it's worth putting up with some inconvenience. As a lifelong non-drinker living in pub-culture Britain, I can only say: good luck with that.


Illustrations: Kashmir Hill and her new technology.

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.

February 1, 2019

Beyond data protection

For the group assembled this week in Brussels for Computers, Privacy, and Data Protection, the General Data Protection Regulation that came into force in May 2018 represented the culmination of years of effort. The mood, however, is not so much self-congratulatory as "what's next?".

The first answer is a lot of complaints. An early panel featured a number of these. Max Schrems, never one to shirk, celebrated GDPR day in 2018 by joining with La Quadrature du Net to file two complaints against Google, WhatsApp, Instagram, and Facebook over "forced consent". Last week, he filed eight more complaints against Amazon, Apple, Spotify, Netflix, YouTube, SoundCloud, DAZN, and Flimmit regarding their implementation of subject access rights. A day or so later, the news broke: the French data protection regulator, CNIL, has fined Google €50 million (PDF) on the basis of their complaint - the biggest fine so far under the new regime that sets the limit at 4% of global turnover. Google is considering an appeal.
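For a sense of scale: that 4% ceiling dwarfs the actual fine. A back-of-the-envelope calculation, assuming Alphabet's reported 2018 revenue of roughly $136.8 billion - call it €120 billion; the conversion is mine, not CNIL's:

```python
# Rough arithmetic only; the turnover figure is an assumption for
# illustration, not taken from the CNIL ruling.
global_turnover_eur = 120e9          # assumed ~2018 Alphabet revenue
cap = 0.04 * global_turnover_eur     # GDPR's 4% ceiling
fine = 50e6                          # CNIL's 50 million euro fine

print(f"theoretical cap: ~{cap / 1e9:.1f} billion euros")   # ~4.8
print(f"fine as share of cap: {fine / cap:.1%}")            # ~1.0%
```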

It's a start. We won't know for probably five years whether GDPR will have the intended effect of changing the balance of power between citizens and data-driven companies (even though one site is already happy to call it a failure). Meanwhile, one interesting new development is Apple's crackdown on first Facebook and then Google for abusing Apple's enterprise app program to collect comprehensive data on end users. While Apple is certainly far less dependent on data collection than the rest of GAFA/FAANG, this action is a little like those types of malware that download anti-virus software to clean your system of the competition.

The second - more typical of a conference - is to stop and think: what doesn't GDPR cover? The answers are coming fast: AI, automated decision-making, household or personal use of data, and (oh, lord) blockchain. And, a questioner asked late on Wednesday, "Is data protection privacy, data, or fairness?"

Several of these areas are interlinked: automated decision-making is currently what we mean when we say "AI", and we talk a lot about the historical bias stored in data and the discrimination that algorithms derive from training data and bake into their results. Discussions of this problem, Ansgar Koene said, tend to portray accuracy and fairness as a tradeoff, with accuracy presented as a scientifically neutral reality and fairness as a fuzzy human wish. Instead, he argued, accuracy depends on the values we choose to judge it by. Why shouldn't fairness just be one of those values?
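A toy illustration of that point, with invented numbers: if the historical labels themselves encode biased hiring, the model that scores as perfectly "accurate" is also the one that reproduces the discrimination, and a fairer model looks worse only because the yardstick is the biased data.

```python
# Invented data: group "a" was historically hired at three times the
# rate of group "b". Nothing here comes from Koene's talk; it's a toy.
labels = [1, 1, 1, 0, 1, 0, 0, 0]             # historical hire/no-hire
groups = ["a", "a", "a", "a", "b", "b", "b", "b"]

biased = [1, 1, 1, 0, 1, 0, 0, 0]             # reproduces history exactly
fairer = [1, 1, 0, 0, 1, 1, 0, 0]             # equalizes selection rates

def accuracy(preds):
    return sum(p == l for p, l in zip(preds, labels)) / len(labels)

def rate(preds, group):
    picked = [p for p, g in zip(preds, groups) if g == group]
    return sum(picked) / len(picked)

for name, preds in [("biased", biased), ("fairer", fairer)]:
    gap = abs(rate(preds, "a") - rate(preds, "b"))
    print(f"{name}: accuracy {accuracy(preds):.2f}, parity gap {gap:.2f}")
# biased: accuracy 1.00, parity gap 0.50
# fairer: accuracy 0.75, parity gap 0.00
```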

A bigger limitation - which we've written about here since 2015 - is that privacy law tends to focus on the individual. Seda Gürses noted that focusing on the algorithm - how to improve it and reduce its bias - similarly ignores the wider context and network externalities. Optimize the Waze algorithm so each driver can reach their destination in record time, and the small communities whose roads were not built for speedy cut-throughs bear the costs of the extra traffic, noise, and pollution those drivers generate. Next-generation privacy will have to reflect that wider context; as Dennis Hirsch put it, social protection rather than individual control. As Schrems' and others' complaints show, individual control is rarely ours on today's web in any case.

Privacy is not the only regulation that suffers from that problem. At Tuesday's pre-conference Privacy Camp, several speakers deplored the present climate in which platforms' success in removing hate speech, terrorist content, and unauthorized copyrighted material is measured solely in numbers: how many pieces, how fast. Such a regime does not foster thoughtful consideration, nuance, respect for human rights, or the creation of a robust system of redress for the wrongly accused. "We must move away from the idea that illegal content can be perfectly suppressed and that companies are not trying hard enough if they aren't doing it," Mozilla Internet policy manager Owen Bennett said, going on to advocate for a wider harm reduction approach.

The good news, in a way, is that privacy law has fellow warriors: competition, liability, and consumer protection law. The first two of those, said Mireille Hildebrandt, need to be rethought, in part because some problems will leave us no choice. She cited, for example, the energy market: as we are forced to move to renewables, both supply and demand will fluctuate enormously. "Without predictive technology I don't see how we can solve it." Continuously predicting the energy use of each household will, she wrote in a paper in 2013 (PDF), pose new threats to privacy, data protection, non-discrimination, and due process.

One of the more interesting new (to me, at least) players on this scene is Algorithm Watch, which has just released a report on algorithmic decision-making in the EU that recommends looking at other laws that are relevant to specific types of decisions, such as applying equal pay legislation to the gig economy. Data protection law doesn't have to do it all.

Some problems may not be amenable to law at all. Paul Nemitz posed this question: given that machine learning training data is always historical, and that therefore the machines are always perforce backward-looking, how do we as humans retain the drive to improve if we leave all our decisions to machines? No data protection law in the world can solve that.

Illustrations: The CPDP 2019 welcome sign in Brussels.

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.