A fatal accident involving a Tesla Model S in Autopilot mode has hit the headlines around the world. Does this undermine the case for autonomous vehicles? We don’t think so, although it does highlight concerns that we have expressed before.
The circumstances of the accident seem unusual. Initial reports suggest that a truck was crossing a multi-lane highway perpendicular to oncoming traffic. The truck was white, and it is thought that the car’s technology, as well as the driver, failed to spot it against a bright sky.
The technology is still developing
Tesla has expressed profound regret and says that
“Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert”.
We have argued that legislators should grasp the nettle and set an agreed standard that the passive safety features of autonomous vehicles must meet before further autonomous driving features are released to the public.
This is particularly important for autonomous critical event control – the control needed when an emergency arises.
Tesla’s autonomous system is still safe
Tesla explains that
“this is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles.”
In the real world, these systems cannot and will not be perfect. But neither are human drivers. And evidence suggests that these systems will be safer than human drivers. Data from near misses and collisions can be collated to feed into future improvements in a way that is not possible with individual human drivers.
Relying on a supervising human can be unrealistic
Tesla points out that the Autopilot feature is switched off by default and must be enabled by drivers, who acknowledge that it is a feature in public beta phase. Drivers are notified that they must maintain control and responsibility for the vehicle, keeping their hands on the wheel even while the system is operational.
In our view, requiring drivers to remain alert in this way is dangerous and inappropriate. Although drivers may be given warnings about their ultimate responsibility for the vehicle, as the technology improves drivers will see their role diminishing, and the temptation to do something else will be too strong for most to resist. Expecting drivers to maintain the level of alertness needed to take action in an emergency is wholly unrealistic.
The slide to full autonomy is, we feel, the most dangerous approach to take.
No need to change direction
US regulator the National Highway Traffic Safety Administration (NHTSA) is investigating the crash and will report its findings in due course, but meanwhile this tragic incident is not, by itself, a reason for autonomous vehicle developers to change direction.