It was going to happen eventually. Last night, one of Uber’s driverless cars struck and killed a pedestrian in Arizona.
It’s not the first time someone has been killed in an accident involving autonomous technology – a Tesla driver died in 2016 when his Autopilot system failed to ‘see’ a white tractor-trailer crossing a fast road – and early reports suggest the Uber car was not at fault. It nonetheless marks a turning point.
Amid mounting public alarm and calls for tighter regulation, Uber has paused its testing programme; others may follow.
It raises a potentially existential question about driverless technology – who is liable when a car without a driver kills someone?
This isn’t like Uber and its rivals driving through the blurred lines of outmoded taxi regulation. Where public safety is concerned – where these decisions directly influence whether people live or die – the law cannot simply sit by and wait.
As it stands now, liability for motor accidents in the UK arises under the law of tort, explains Antony Morris, partner at Clarkslegal. ‘Drivers owe a duty of care to other road users, and if that duty of care is breached, there is liability. That is backed up by compulsory motor insurance.’
This existing framework simply isn’t designed to handle the complexities of autonomous vehicles.
‘If a fully driverless car has an accident, the fault may result from a number of factors: software error, mechanical failure and hacking are all possibilities,’ says Morris. ‘What would need to be shown to establish liability in those circumstances? And who would be at fault: the vehicle owner, the manufacturer, the person responsible for maintenance?’
The British government intends to clear the legal, insurance and regulatory barriers for driverless cars, with the aim of making the UK a leader in the technology by 2021. Answering such questions will be a part of that.
Yet it will not find this easy while public opinion is inflamed by the image of innocent pedestrians mown down by robot cars, put on the market by impossibly rich tech companies that barely pay any tax.
The danger is that we’ll end up in a Catch-22: driverless cars will need to be demonstrably far superior to those with human drivers before they are approved for use on the roads, yet without real-world testing they’ll never ‘learn’ to be better drivers in the first place.
That would be a terrible shame: autonomous vehicles have the potential to drastically cut road deaths, traffic congestion and air pollution. The technology should be pursued, but with great care to protect human life along the way – contrary to appearances, this is not a race.