Tuesday, March 20, 2018

Test Case

So it finally happened: a self-driving car struck and killed a pedestrian in Arizona. And, of course, the car was an Uber.

(Why Uber? Well, Uber is a taxi firm. Lots of urban and suburban short journeys through neighbourhoods where fares cluster. In contrast, once you set aside the hype, Tesla's autopilot is mostly an enhanced version of the cruise control systems that Volvo, BMW, and Mercedes have been playing with for years: lane tracking on highways, adaptive cruise control ... in other words, features used on longer, faster journeys, which are typically driven on roads such as motorways that don't have mixed traffic types.)

There's going to be a legal case, of course, and the insurance corporations will be taking a keen interest because it'll set a precedent and case law is big in the US. Who's at fault: the pedestrian, the supervising human driver behind the steering wheel who didn't stop the car in time, or the software developers? (I will just quote from CNN Tech here: "the car was going approximately 40 mph in a 35 mph zone, according to Tempe Police Detective Lily Duran.")

This case, while tragic, isn't really that interesting. I mean, it's Uber, for Cthulhu's sake (corporate motto: "move fast and break things"). That's going to go down real good in front of a jury. Moreover ... the maximum penalty for vehicular homicide in Arizona is a mere three years in jail, which would be laughable if it wasn't so enraging. (Rob a bank and shoot a guard: get the death penalty. Run the guard over while they're off-shift: max three years.) However, because the culprit in this case is a corporation, the worst outcome they will experience is a fine. The soi-disant "engineers" responsible for the autopilot software experience no direct consequences of moral hazard.

But there are ramifications.

Firstly, it's apparent that the current legal framework privileges corporations over individuals with respect to moral hazard. So I'm going to stick my neck out and predict that there's going to be a lot of lobbying money spent to ensure that this situation continues ... and that in the radiant Randian libertarian future, all self-driving cars will be owned by limited liability shell companies. Their "owners" will merely lease their services, and thus evade liability for any crash when they're not directly operating the controls. Indeed, the cars will probably sue any puny meatsack who has the temerity to vandalize their paint job with a gout of arterial blood, or traumatize their customers by screaming and crunching under their wheels.

Secondly, sooner or later there will be a real test case on the limits of machine competence. I expect to see a question like this show up in an exam for law students in a decade or so:

A child below the age of criminal responsibility plays chicken with a self-driving taxi, is struck, and is injured or killed. Within the jurisdiction of the accident (see below), pedestrians have absolute priority (there is no offense of jaywalking), but it is an offense to obstruct traffic deliberately.

The taxi is owned by a holding company. The right to operate the vehicle, and the taxi license (or medallion, in US usage) are leased by the driver.

The driver is doing badly (predatory pricing competition by the likes of Uber is to blame for this) and is unable to pay for certain advanced features, such as a "gold package" that improves the accuracy of pedestrian/obstacle detection from 90% to 99.9%. Two months ago, because they'd never hit anyone, the driver downgraded from the "gold package" to a less-effective "silver package".

The manufacturer of the vehicle, who has a contract with the holding company for ongoing maintenance, disabled the enhanced pedestrian avoidance feature for which the driver was no longer paying.

The road the child was playing chicken on is a pedestrian route closed to private cars and goods traffic but open to public transport.

In this jurisdiction, private hire cars are classified as private vehicles, but licensed taxis are legally classified as public transport when (and only for as long as) they are collecting or delivering a passenger within the pedestrian area.

At the moment of the impact the taxi has no passenger, but has received a pickup request from a passenger inside the pedestrian zone (beyond the accident location) and is proceeding to that location on autopilot control.

The driver is not physically present in the vehicle at the time of the accident.

The driver is monitoring their vehicle remotely from their phone, using a dash cam and an app provided by the vehicle manufacturer but subject to an EULA that disclaims responsibility and commits the driver to binding arbitration administered by a private tribunal based in Pyongyang acting in accordance with the legal code of the Republic of South Sudan.

Immediately before the accident the dash cam view was obscured by a pop-up message from the taxi despatch app that the driver uses, notifying them of the passenger pickup request. The despatch app is written and supported by a Belgian company and is subject to an EULA that disclaims responsibility and doesn't impose private arbitration but requires any claims to be heard in a Belgian court.

The accident took place in Berwick-upon-Tweed, England; the taxi despatch firm is based in Edinburgh, Scotland.

Discuss!

by Charlie Stross, Charlie's Diary