Main

November 25, 2022

Assume a spherical cow

The early months of 2020 were a time of radical uncertainty - that is, decisions had to be made that affected the lives of whole populations where little guidance was available. As Leonard Smith and David Tuckett explained at their 2018 conference on the subject (and a recent Royal Society scientific meeting), decisions under radical uncertainty are often one-offs whose lessons can't inform the future. Tuckett and Smith's goal was to understand the decision-making process itself in the hope that this part of the equation at least could be reused and improved.

Inevitably, the discussion landed on mathematical models, which attempt to provide tools to answer the question, "What if?" This question is the bedrock of science fiction, but science fiction writers' helpfulness has limits: they don't have to face bereaved people if they get it wrong; they can change reality to serve their sense of fictional truth; and they optimize for the best stories, rather than the best outcomes. Beware.

In the case of covid, humanity had experience in combating pandemics, but not covid, which turned out to be unlike the first virus family people grabbed for: flu. Imperial College epidemiologist Neil Ferguson became a national figure when it became known that his 2006 influenza model, which suggested that inaction could lead to 500,000 deaths, had influenced the UK government's delayed decision to impose a national lockdown. Ferguson remains controversial; Scotland's The Ferret offers a fact check suggesting that many critics failed to understand the difference between projection and prediction, and the importance of the caveat "if nothing is done". Models offer possible futures, but not immutable ones.

As Erica Thompson writes in her new book, Escape From Model Land: How Mathematical Models Can Lead Us Astray and What We Can Do About It, models also have limits that we ignore at our peril. Chief among them is the fact that the model is always an abstracted version of reality. If it weren't, our computers couldn't calculate them any more than they can calculate all the real world's variables. Thompson therefore asks: how can we use models effectively in decision making without becoming trapped inside the models' internal worlds, where their simplified assumptions are always true? More important, how can we use models to improve our decision making with respect to the many problems we face that are filled with uncertainties?

The science of covid - or of climate change - is only a small part of the factors a government must weigh in deciding how to respond; what science tells us must be balanced against the economic and social impacts of different approaches. In June 2020, Ferguson estimated that locking down a week earlier would have saved 20,000 lives. At the time, many people had already begun withdrawing from public life. And yet one reason the government delayed was the belief that the population would quickly give in to lockdown fatigue and resist restrictions, rendering an important tool unusable later, when it might be needed even more. This assumption turned out to be largely wrong, as was the assumption in Ferguson's 2006 model that 50% of the population would refuse to comply with voluntary quarantine. Thompson calls this misunderstanding of public reaction a "gigantic failure of the model".

What else is missing? she asks. Ferguson had to resign when he himself was caught breaking the lockdown rules. Would his misplaced belief that the population wouldn't comply have been corrected by a more diverse team?

Thompson began her career with a PhD in physics that led her to examine many models of North Atlantic storms. The work taught her more about the inferences we make from models than about storms, and it opened for her the question of how to use the information models provide without falling into the trap of failing to recognize the difference between the real world and Model Land - that is, the assumption-enclosed internal world of the models.

From that beginning, Thompson works through different aspects of how models work and where their flaws can be found. Like Cathy O'Neil's Weapons of Math Destruction, which illuminated the abuse of automated scoring systems, this is a clearly written and well thought-out book that makes a complex mathematical subject accessible to a general audience. Thompson's final chapter, which offers approaches to evaluating models and lists of questions to ask modelers, should be read by everyone in government.

Thompson's focus on the dangers of failing to appreciate the important factors models omit leads her to skepticism about today's "AI", which of course is trained on such models: "It seems to me that rather than AI developing towards the level of human intelligence, we are instead in danger of human intelligence descending to the level of AI by concreting inflexible decision criteria into institutional structures, leaving no room for the human strengths of empathy, compassion, a sense of fairness and so on." Later, she adds, "AI is fragile: it can work wonderfully in Model Land but, by definition, it does not have a relationship with the real world other than one mediated by the models that we endow it with."

In other words, AI works great if you can assume a spherical cow.


Illustrations: The spherical cow that mocks unrealistic scientific models drawn jumping over the moon by Ingrid Kallick for the 1996 meeting of the American Astronomical Association (via Wikimedia).

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.

September 30, 2022

Regression

They got what they wanted, and now they're screwing it up.

"They" in that sentence, is entertainment industry rights holders, who campaigned for years in bad ways and worse ways to get rid of "piracy" - that is, unauthorized copying and digital distribution of their products. In pursuit of that ideal, they sued popular companies (Napster, MP3.com) out of existence; prosecuted users and demanded ISPs' help in doing so; applied digital rights management to everything from software and classic books to tractors and wheelchairs; and pursued national legislation and trade treaties to entrench their business model.

"What they wanted" was people paying for the cultural artefacts they finance. How they got it, in the end, was not through any of the above efforts. Instead, as many scholars and activists told them during those years it would be, the solution was legally authorized services for which people were willing to pay. And thus grew and flourished video services such as Netflix, YouTube, Hulu, and, latterly, Disney, Amazon Prime, Apple TV, and and music services like Apple iTunes, Spotify, and Amazon Music. The industry began making money from digital downloads. So yay?

You would think. Instead, we're going backwards. The reality now is that paid services are becoming a chore to use: users complain the interfaces are frustrating, and that the thing they want to watch is always on some other service. Newspapers now track where to find popular older shows, and know only a sliver of the mass audience will be able to see some of the new material they review.

Result: pirate sites are back on top. You can find almost anything in one search, it's yours to watch any way you want within minutes, and any ads have been neatly excised. Like I said: they got what they wanted and then...

This tiny rant had two immediate provocations. The first was the release of Glyn Moody's new book, Walled Culture (available here as a freely downloadable PDF). The other was two Guardian stories by Jim Waterson about Buckingham Palace's wrangle with the UK's national broadcasters over the footage of the recent state funeral of Queen Elizabeth II. The BBC, ITV, and Channel 4 are allowed future use of just one hour's worth of clips; for anything else they must ask permission.

This was a state occasion, paid for by taxpayers, held on public streets and in public buildings, and the video recording was made by broadcasters, which are financed by universal license fees (BBC) and their own commercial activities (all of them). It's particularly bonkers because the entirety of the day's footage is readily available on torrent sites. The palace literally cannot control the footage as it could at the 1953 coronation - though it can limit broadcast. Waterson also reveals that behind the scenes during the various services, palace staff and broadcasters shared a WhatsApp group in which the staffers sent a message every five minutes to approve or refuse the use of the previous video block. This power to micromanage how they are seen is more power than most people think the monarchy has in 2022. The palace is also claiming the right to veto the use of footage of the new monarch's accession service. This is the rawest form of copyright as entrenched power.

In Walled Culture, Moody recounts the Internet's three decades of copyright wrangles, and the resulting shrinkage of public access to culture. It's a great romp through a legal regime that, as Jessica Litman said circa 1998, people would reject if they understood it. Moody begins with the shift from analogue to digital media, then goes through the lawsuits, the battle to make the results of publicly funded research open to the public, web blocking and other censorship, the EU's copyright directive, and the regulatory capture that, as Moody says, leaves impoverished the artists and creators copyright law was originally designed to benefit.

My favorite chapter, however, is the one on copyright absurdities. Half of the commercial movies ever made are unavailable to view. Because of the way streaming is licensed, Netflix 2022 has a library perhaps a tenth the size of Netflix 2012 - or 2002, when the rental service's copy of a DVD could not be withdrawn. Yet digital media have a notoriously short life before they must be migrated to newer media and formats. Copyright is even why statisticians continue to use suboptimal statistical methods: in the 1920s Karl Pearson refused fellow statistician Ronald A. Fisher permission to use his statistical tables.

As Moody shows, the impact of copyright law is widely felt, and its abuse even more so. Bear in mind that the original purpose was to balance the public interest (as opposed to the public's interest) in its own culture against the desirability of encouraging creators and artists to go on creating new works by giving them a relatively brief period of exclusivity in which to exploit their work. For that reason, a world in which piracy is the best option for accessing culture is not a good world. Moody proposes numerous fixes that roll back the worst elements and change the power imbalance. We do want to pay artists and creators, especially those whose voices have largely gone unheard in the past. Rights holders should not be - ahem - kings.


Illustrations: Queen Elizabeth II's funeral procession (via Wikimedia).


April 1, 2022

Grounded

"The airline probably needed to do a better job to make sure its pilots understood exactly what to do in case the aircraft was performing in a unique, unusual way, and how to get out of the problem," former National Transportation Safety Board chair Mark Rosenker tells CBS News in the recent documentary Downfall: The Case Against Boeing (directed by Rory Kennedy, written by Mark Bailey and Keven McAlester, and streaming on Netflix). He then downplays the risk to passengers: "Certainly in the United States they understand how to operate this aircraft."

Rosenker was speaking soon after the 2018 Lion Air crash.

Three oh-my-god wrong things here: the smug assumption that *of course* American personnel are more competent than their Indonesian counterparts (see also contemporaneous articles dissing Indonesia's airline safety record); the presumption that a Boeing aircraft is safe and the crash a non-recurring phenomenon; and the resulting conclusion that it must be the pilot's fault. All that went largely unchallenged until the Ethiopian Airlines crash, 19 weeks later. Even then, numerous countries grounded the plane before the US finally followed suit - and even *then* it was ordered by the president, not the Federal Aviation Administration. The FAA's regulatory failure needs its own movie.

As we all now know, a faulty angle-of-attack sensor sent bad data to the aircraft's Maneuvering Characteristics Augmentation System, software intended to stabilize the plane. The pilot did his best in an impossible situation. Even after that became clear, Boeing still blamed the crew for not turning off MCAS - a system Boeing had never told them was there. In Congressional testimony, the hero of the Hudson, Captain Sully Sullenberger, summed it up thusly: "We shouldn't expect pilots to have to compensate for flawed designs."

This blame game was a betrayal. One reason aviation is so safe is that all sides have understood that every crash damages everyone. The industry therefore embraced extensive cross-collaboration in which everyone is open about the causes of failures and shares solutions. Blame destroys that culture.

All of this could be a worked example in Jessie Singer's recent book There Are No Accidents: The Deadly Rise of Injury and Disaster - Who Profits and Who Pays the Price. Of course unintended injuries happen, but calling them "accidents" removes culpability and stops us from thinking too much about larger causes. "Accident" means: "nothing to see here".

With the 737 MAX, as press articles suggested at the time and the documentary shows, that larger cause was the demise of Boeing's pride-of-America safety-first engineering culture, which rewarded employees for reporting problems. The rot began in 1997, when a merger brought in new bosses from McDonnell Douglas, and, former quality manager John Barnett tells the camera, "Everything you've learned for 30 years is now wrong." Value for shareholders replaced safety-first. The workforce was thinned. Planes were made of cheaper materials. Headquarters left Seattle, where engineering was based, for Chicago. The culture of safety gave way to a culture of concealment.

Aviation learned early the importance of ergonomic design to avoid pilot error. This is where the documentary is damning: Boeing's own emails show the company knew pilots needed training for MCAS and never provided it, even when directly asked - by Lion Air itself, in 2017. Boeing executives mocked them for asking, even though its own risk assessments predicted a 737 MAX crash every fifteen years. Boeing bet it could fix, test, and implement MCAS before it caused more trouble. It was wrong.

A fully-loaded plane crash makes headlines and sparks protests and Congressional investigations. Most of the "accidents" Singer writes about, however - traffic crashes, house fires, falls, drownings, and the nearly 840,000 opioid deaths classed as "unintentional injury by drug poisoning" since 1999 (see also Alex Gibney's Crime of the Century) - near-invisibly kill in a statistical trickle. One such was her best friend, killed when a car hit his bike. All these are "accidents" caused by human error. But even with undercounts of everything from shootings to medical errors, "accidents" were the third leading cause of death in the US in 2019, behind heart disease and "malignant neoplasms" (cancer), ahead of cerebrovascular disease, chronic lower respiratory disease, Alzheimer's, and diabetes. We research all those *and* covid-19, which was number three in 2020. Why not "accidents"? (Note: this all skews American; other wealthy countries are safer.)

Singer's argument resonates because during my ten years as the in-house writer for RISCS, then-director Angela Sasse argued repeatedly that users will do the secure thing if it's the easiest path to follow, and "user errors" are often failed security policies. Sometimes, fixes seem tangential, such as lessening worker stress by hiring more staff, updating computer systems, or ensuring better work-life balance, which may improve security because tired, stressed workers make more mistakes.

Singer argues that the human errors that cause "accidents" are predictable and preventable, and surviving them is a "marker of privilege". Across the US, she finds poverty correlated with "accidental" death and wealth with safety. The pandemic made this explicit. But Singer reminds us that the same forces frame people crossing the street as "jaywalkers" and blame workers killed on factory lines for not following posted rules. Each time, the less powerful are framed as the cause of their own demise. And so it required that second 737 MAX crash and 157 more deaths to ground that plane.


Illustrations: The Boeing 737 MAX (Boeing).
