
The user in charge

Last week, we learned that last October prosecutors in California filed two charges of vehicular manslaughter against the driver of a Tesla on Autopilot who ran a red light in 2019, hit another car, and killed two people.

As they say, we've had this date since the beginning.

Meanwhile, in the UK the Law Commission is trying to get ahead of the market by releasing a set of proposals covering liability for automated driving. Essentially, its report concludes that when automation is driving a car, liability for accidents and dangerous driving should shift to the "Authorized Self-Driving Entity" (ASDE) - that is, the company that was granted the authorization to operate on public roads. Perfectly logical; what's jarring is the report's linguistic shift that turns "drivers" into "users", with all the loss of agency that implies. Of course, that's always been the point, particularly for those who contend that automated driving will (eventually) be far safer. Still.

Ever since the first projects developing self-driving cars appeared, there have been questions about how liability would be assigned. The fantasy promulgated in science fiction and Silicon Valley is that someday cars will reach Level 5 automation, in which human intervention is no longer needed. Ten years ago, when many people thought that was just a few years away, there were serious discussions about whether these future cars should have steering wheels when they could have bedroom-style accommodation instead. You also heard a lot about improving accessibility and inclusion for those currently barred from driving because of visual impairment, childhood, or blood alcohol level. Other benefits were mooted, too: less congestion, less pollution, better use of resources through sharing. And think, Clive Thompson wrote at Mother Jones in 2016, of all the urban parking space we could reclaim for cyclists, pedestrians, and parks.

In 2018, Christian Wolmar argued that a) driverless cars were far away, and b) they would not deliver the benefits the hypesters were predicting. Last year, he added that self-driving cars won't solve the problems people care about, like congestion and pollution; that drivers will become deskilled; and that shared use is not going to be a winning argument. I agree with most of this. For example, if we take out all the parking, won't congestion increase as cars ferry themselves back home after dropping off their owners, to wait there until the end of the day?

So far, Wolmar appears to have been right. Several of the best-known initiatives have either closed down or been sold, and the big trend is consolidation into the hands of large companies that can afford to invest and wait. Full automation seems as far away as ever.

Instead, we are mired in what everyone eventually agreed would be the most dangerous period in the shift to automated driving: the years or decades of partial and inconsistent automation. As the Tesla crash shows, humans overestimating their cars' capabilities is one problem. A second is the predictability gap between humans and AIs. As humans ourselves, we're pretty good at guessing how other human drivers will likely behave. We all tend to put distance between ourselves and cars with lots of dents and damage or cars exhibiting erratic behavior, and pedestrians learn young to estimate the speed at which a car is approaching in order to decide whether it's safe to cross the street. We do not have the same insight into how a self-driving car is programmed to behave - and we do not appear predictable to its systems, either. One bit of complexity I imagine will increasingly matter is that the car's sensors will react to differences we can't perceive.

At the 2016 We Robot conference, Madeleine Clare Elish introduced the idea of moral crumple zones. In a hybrid AI-human system, she argued, the blame when anything goes wrong will be assigned to the human element. The Tesla Autopilot crash we began with is a perfect example, and inevitable under current US law: the US National Highway Traffic Safety Administration holds that the human in charge of the car is always responsible. Since a 2018 crash, Tesla has reportedly tried to make it clearer to customers that even its most sophisticated cars cannot drive themselves, and, according to the Associated Press, has updated its software to make it "harder for drivers to abuse it".

Pause for bafflement. What does "abuse" mean in that sentence? That a driver expects something called "Autopilot" to...drive the car? It doesn't help people form an accurate picture of their car's capabilities that in December Tesla decided to add a gaming console to its in-car display. Following an announcement by the US National Highway Traffic Safety Administration that it would investigate, Tesla is updating the software so that the gaming feature locks when the car is moving. Shouldn't the distraction potential have been obvious? That's Theranos-level recklessness.

This is where the Law Commission's report makes a lot of sense. It pins the liability squarely on the ASDE for things like misleading marketing, and it sets requirements for handling transitions to human drivers, the difficulty of which was so elegantly explored in Dexter Palmer's Version Control. The user in charge is still responsible for things like insurance and getting kids into seatbelts. The proposals will now be considered by the UK's national governments.


Illustrations: Dominic Wilcox's concept driverless car.

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.
