New rules to let self-driving cars break the speed limit or mount kerbs to avoid accidents are being drawn up in a ‘digital Highway Code’.
And in a radical legal move, the car maker itself could be punished if a driverless car speeds without justification or causes a fatal accident.
One key question in an official review launched yesterday is whether automated vehicles should, like human drivers, be allowed to break the rules for a greater good.
Should they be programmed to mount the kerb to avoid a child in the road, to let an ambulance pass, or to free two cars stuck in a narrow street?
Or should they be instructed never to swerve on to the pavement to avoid someone in the road, because doing so could ‘endanger innocent passers-by on the pavement, simply to avoid a person who is at fault’?
Other dilemmas include whether a car should be programmed to deliberately run over one pedestrian if that avoids hitting a larger group.
Officials are also asking whether a driverless car should be allowed to edge through pedestrians that block its path. They fear a vehicle may never get anywhere if pedestrians know it will always stop when they walk in front of it.
A public consultation on the major issues is part of a three-year project to get driverless cars on Britain’s roads by 2021.
This ambitious deadline set by Transport Secretary Chris Grayling has been dismissed as unrealistic by many experts.
A fatal collision in Arizona involving a self-driving Uber vehicle in March has fuelled concerns that the technology is being rushed. The review will amount to the biggest shake-up of UK road regulations since the introduction of the Highway Code in 1931.
The report predicts that there will be situations when automated vehicles could be allowed to speed. These could include the need to overtake quickly to avoid a collision, to avoid sharp braking on reaching a lower speed limit, or to keep traffic flowing smoothly.
Another key question is who will be responsible for accidents. New criminal offences could allow car manufacturers to be prosecuted and punished. In extreme cases they could lose their licence to make driverless cars.
The report also proposes that automated vehicles should need permission to travel without a driver on hand to take over in an emergency.
Roads minister Jesse Norman said: ‘With automated driving technology advancing rapidly, it is important that our laws and regulations keep pace so that the UK can remain a world leader in this field.’
AA president Edmund King said: ‘There are still 101 questions. The moral dilemma is whether the car is programmed to always protect the occupants – does the car save the driver or the pedestrian?
‘However it is good that these questions are being asked now before robot technology just decides for itself.’