It’s difficult to teach a machine to react correctly to the new and unpredictable situations we frequently encounter while driving. Enormous engineering effort has gone into cracking this problem. But how do we determine when a vehicle is safe enough?
In order to be 95% certain that autonomous vehicles match the safety of human drivers, the cars would need to log 275 million failure-free autonomous miles, according to a report from the Rand Corp. And to prove that autonomous vehicles are even just 10% or 20% safer than humans, the requirement jumps to billions of miles. Since 2009, autonomous tech company Waymo’s vehicles have driven a little more than 20 million miles.
Either manufacturers must spend potentially decades testing small fleets, or the public will wind up taking part in the testing. The latter is acceptable only if infrastructure is in place to ensure the safety of drivers and pedestrians.
CITY AND VEHICLE INFRASTRUCTURE ISN’T IN PLACE
Expecting individual autonomous vehicles to operate independently is a recipe for disaster. Each vehicle would have to guess what all the others are doing. Each would rely only on its own limited view of the world, with sensors and cameras that can fail or be obstructed by poor weather or road debris.
Enabling vehicles to communicate with one another reduces the possibility of unpleasant surprises and allows vehicles to make communal decisions to maintain speed and safety. Some cars can already communicate this way, but there are no rules in place to guarantee that cars from different manufacturers will be able to talk to one another.
Infrastructure specific to autonomous vehicles, such as smart traffic lights and camera systems, could alert vehicles about pedestrians, cyclists and dangerous road conditions and help prevent accidents. Unfortunately, it has yet to be determined who would pay for the necessary infrastructure upgrades and whether Americans would be willing to accept more surveillance on their roads.
IT’S UNCLEAR WHO IS LIABLE WHEN AN ACCIDENT HAPPENS
As long as self-driving features require the driver to be ready to take control, the driver will remain liable for any accidents. Car manufacturers are only liable if there’s a fault in their vehicle. But what happens if an autonomous passenger car causes an accident? Is the manufacturer liable because it designed the system that’s at fault?
Some states are trying to address the question. Florida passed a law saying that the person who initiates a trip in an autonomous vehicle is considered the operator. While the law doesn’t explicitly establish liability, it lays a foundation for how liability may be addressed. But the process is piecemeal, and so far existing laws haven’t faced serious challenges in court.
Until there are laws in place to protect them, it’s unlikely automakers will accept the risk of allowing drivers to use self-driving features without requiring them to remain ready to take control of the vehicle.
SELF-DRIVING FLEETS WILL PAVE THE WAY
In the near future, the first autonomous vehicles will likely be taxis and cargo trucks. Both industries have remained bullish on autonomy for several reasons.
First, fleet companies can reduce overall employee counts without totally eliminating the human component. Waymo already has a system in which human supervisors can intercede to course-correct an autonomous vehicle when it encounters a situation where it’s incapable of deciding what to do next.
Second, taxi and trucking companies are already accustomed to bearing liability for their drivers’ mistakes. The money they save on payroll should make up for the liability risk if they believe autonomous systems are about as safe as the humans they replace.
You probably won’t be able to buy an autonomous car any time soon. But expect autonomous fleet services to begin expanding in the near future.
This story was provided to The Associated Press by the automotive website Edmunds. Will Kaufman is a content strategist at Edmunds. Instagram: @didntreadthestyleguide.