
Level two

This week provided two examples of the dangers of believing too much hype about modern-day automated systems and therefore overestimating what they can do.

The first is relatively minor: Google employee Blake Lemoine published his chats with a bot called LaMDA and concluded it was sentient "based on my religious beliefs". Google put Lemoine on leave and the press ran numerous (many silly) stories. Veterans shrugged and muttered, "ELIZA, 1966".

The second, however...

On Wednesday, the US National Highway Traffic Safety Administration released a report (PDF) studying crashes involving cars under the control of "driver-assist" technologies. Out of 367 such crashes in the nine months after NHTSA began collecting data in July 2021, 273 involved Teslas being piloted by either "full self-driving software" or its precursor, "Tesla Autopilot".

There are important caveats, which NHTSA clearly states. Many contextual details are missing, such as how many of each manufacturer's cars are on the road and the number of miles they've traveled. Some reports may be duplicates; others may be incomplete (private vehicle owners may not file a report) or unverified. Circumstances such as surface and weather conditions, or whether passengers were wearing seat belts, are missing. Manufacturers differ in the type and quantity of crash data they collect. Reports may be unclear about whether the car was equipped with SAE Level 2 Advanced Driver Assistance Systems (ADAS) or SAE Levels 3-5 Automated Driving Systems (ADS). Therefore, NHTSA says, "The Summary Incident Report Data should not be assumed to be statistically representative of all crashes." Still, the Tesla number stands out, far ahead of Honda's 90, which itself is far ahead of the other manufacturers listed.

SAE, ADAS, and ADS refer to the system of levels devised by the Society of Automotive Engineers (now SAE International) in 2016. Level 0 is no automation at all; Level 1 is today's modest semi-automated assistance such as cruise control, lane-keeping, and automatic emergency braking. Level 2, "partial automation", is where we are now: semi-automated steering and speed systems, road edge detection, and emergency braking.

Tesla's Autopilot is SAE Level 2. Level 3 - which may someday include Tesla's Full Self-Driving Capability - is where drivers may legitimately begin to focus on things other than the road. In Level 4, most primary driving functions will be automated, and the driver will be off-duty most of the time. Level 5 will be full automation, and the car will likely not even have human-manipulable controls.

Right now, in 2022, we don't even have Level 3, though Tesla CEO Elon Musk keeps promising we're on the verge of it with his company's Full Self-Driving Capability, whose arrival always seems to be one to two years away. As long ago as 2015, Musk was promising Teslas would be able to drive themselves while you slept "within three years"; in 2020 he estimated "next year" - and he said it again a month ago. In reality, it's long been clear that cars autonomous enough for humans to check out while on the road are further away than they seemed five years ago, as British transport commentator Christian Wolmar accurately predicted in 2018.

Many warned that Levels 2 and 3 would be dangerous. The main issue, pointed out by psychologists and behavioral scientists, is that humans get bored watching a computer do stuff. In an emergency, where the car needs the human to take over quickly, said human, whose attention has been elsewhere, will not be ready. In this context it's hard to know how to interpret the weird detail in the NHTSA report that in 16 cases Autopilot disengaged less than a second before the crash.

The NHTSA news comes just a few weeks after a New York Times TV documentary investigation examining a series of Tesla crashes. Some of these it links to the difficulty of designing software that can distinguish objects across the road - that is, tell the difference between a truck crossing the road and a bridge. In others, such as the 2018 crash in Mountain View, California, the NTSB found a number of contributing factors, including driver distraction and overconfidence in the technology - "automation complacency", as Robert L. Sumwalt calls it politely.

This should be no surprise. In his 2019 book, Ludicrous, auto industry analyst Edward Niedermeyer mercilessly lays out the gap between the rigorous discipline the motor industry has embraced so it can turn out millions of cars at relatively low margins with very few defects and the manufacturing conditions Niedermeyer observes at Tesla. The high-end, high-performance niche sports cars Tesla began with were, in Niedermeyer's view, perfectly suited to the company's disdain for established industry practice - but not to meeting the demands of a mass market, where affordability and reliability are crucial. In line with Niedermeyer's observations, Bloomberg Intelligence predicts that Volkswagen will take over the lead in electric vehicles by 2024. Niedermeyer argues that because it's not suited to the discipline required to serve the mass market, Tesla's survival as a company depends on these repeated promises of full autonomy. Musk himself even said recently that the company is "worth basically zero" if it can't solve self-driving.

So: financial self-interest meets the danger zone of Level 2 technology sold with Level 4 perceptions. I can't imagine anything more dangerous.

Illustrations: One of the Tesla crashes investigated in New York Times Presents.

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard - or follow on Twitter.

