Google, Tesla, Ford, Volvo, Uber and Apple have all embraced automated vehicle technology, and fully automated cars are expected to hit the road in the next decade.
Self-driving cars are an extraordinary revolution, likely to drastically reduce the number of road fatalities. However, the introduction of automated cars raises serious legal concerns and calls for regulatory changes, ranging from liability and insurance to wider ethical considerations. The problem is, artificial intelligence is developing faster than regulation. The longer legislation takes, the fewer lives will be saved. To foster technological progress and improve public safety, regulators need to settle a legal framework as quickly as possible.
Liability – who will pay?
Insurance and liability are crucial issues that’ll need to be tackled before allowing driverless cars on the roads.
- What happens if a self-driving car and a pedestrian collide?
- Who should be held liable: the car maker, the insurer or the car owner?
The UK is one of the first European countries to start addressing these issues, with the announcement of a Modern Transport Bill to support automated vehicle technology. Among the reform proposals, the government intends to:
- Remove regulatory barriers to boost automated vehicle technology; and
- Extend motor vehicle insurance to protect victims of accidents caused by vehicles in automated mode.
Basically, should an accident occur with an automated vehicle (eg in the case of system failure), the car insurer pays for it. This would allow the victim of a collision to be compensated quickly and to avoid time-consuming and costly proceedings with the car maker. The insurer would then be able to ask for compensation from the car manufacturer under existing product liability laws.
How do we set safety standards?
The introduction of driverless cars also raises ethical considerations regarding safety standards and critical event control.
What is critical event control?
Critical event control is the ability to take decisive action when facing an emergency situation, for instance when a pedestrian suddenly runs into the road or when a car pulls out of a side junction.
Critical event control will be completely handed over to the vehicle, without the need for a driver to use the brake pedal or the steering wheel. But this means difficult choices will have to be made.
Suppose a driverless car must either hit a pedestrian or swerve in a way that crashes the car and harms its passengers.
- How should the system be programmed?
- How could a robot make the ‘right’ decision when it has to choose between colliding with another vehicle and running over a pedestrian?
A study led by researchers at the Massachusetts Institute of Technology (MIT) revealed that most people favour the option that minimises casualties, and think it is more moral to harm one passenger than to hit 10 pedestrians. But the same respondents also said they would be less likely to use or buy a vehicle programmed in that way.
The thing is, people tend to tolerate accidents caused by human negligence, because human mistakes happen. But things are different when it comes to robots. Are we willing to be as tolerant of a robot as we are of human beings? Are people likely to accept that their child might be hit by a self-driving car, or to buy a machine that would sacrifice its own passengers to save 10 pedestrians?
Facing this dilemma will be a difficult challenge for regulators. As long as they refuse to let artificial intelligence make safety-related choices, it will remain the driver’s responsibility to take back control of the vehicle in critical situations.
Driverless car makers such as Google and Ford are pushing for regulations that allow fully self-driving cars. They believe relying on driver control in an emergency is too dangerous, as the driver’s concentration, reaction time and driving skills may be dulled by automated mode.
So far, Michigan is the only US state to have enacted legislation allowing the testing and use of self-driving cars with no human driver inside, paving the way for entirely autonomous vehicles on the roads.
Will other legislation follow?