There's a number that is all the buzz these days in the world of autonomous vehicles: 94 percent. That's the percentage of car crashes caused by driver error. The implication: Once cars become driverless, there will be hardly any crashes at all.

Not so.

As fully autonomous cars become a reality, there will be collisions between them, and there will be collisions with conventional cars. Fewer than before? Yes, but there will be crashes nevertheless.

The speed with which driverless cars are accepted by the public may well be determined by this question: How safe is safe enough?

"Is 'safe enough' 10 percent safer than where we are by manually driving?" said Bryan Reimer, associate director of the transportation center at Massachusetts Institute of Technology. "One thing that I think is really going to limit our ability to see this technology proliferate is a societal acceptance on the definition of what is safe enough."

Without a generally accepted definition of what constitutes a sufficiently safe driverless car, the industry could suffer from setbacks caused by a combination of public apprehension and news headlines.

When a truck backed into a self-driving bus in Las Vegas last month, one headline mirrored many others: "Las Vegas' self-driving bus crashes in first hour of service."

When a driver using Tesla's Autopilot died in a Florida crash last year, a British magazine used the headline "Tesla driver dies in first fatal autonomous car crash in U.S." and a later headline in Scientific American lamented "Deadly Tesla Crash Exposes Confusion over Automated Driving."

"I think it's important to clear up a possible misconception," said Robert Sumwalt, chairman of the National Transportation Safety Board, after his agency completed its Tesla crash investigation. "The automobile involved in the collision was not a self-driving car."

The Tesla's "Autopilot" system is designed to perform limited tasks and functioned as designed, the investigation concluded. However, the system is meant to augment, not replace the driver. The vehicle's human driver and his overreliance on the system were at fault, the NTSB concluded.

Surveys show people harbor trepidation about giving up driving, and an AAA poll found that 54 percent of drivers feel less safe even sharing the road with fully autonomous cars.

Reimer says, "If consumers are looking at robots as being almost perfect, they can't be."

They can't be and they shouldn't be, a Rand Corporation report says. "Requiring autonomous vehicles to be nearly flawless before putting them on the road could cost hundreds of thousands of lives," the report says.

According to the National Safety Council, 40,000 people were killed in crashes last year, or about 110 each day. More than 1 million have died since 1990. With the National Highway Traffic Safety Administration (NHTSA) concluding that "94 percent of serious crashes are due to dangerous choices or errors people make behind the wheel," the stage would seem set for fully autonomous cars, which, among other things, constantly see 360 degrees around them.

Except for the politics.

"Unless we have defined how safe is safe enough - and we are in agreement - the nature of politics is that fingers will point at each other," Reimer said. "If you think about Congress now, everybody's going to look at everybody else and say 'I don't want to be responsible.' You cause mass pandemonium. That sets the potential viability of these technologies back by decades."

The U.S. Department of Transportation and Congress are trying to balance automakers' desire to develop driverless cars free of burdensome restrictions against the need for regulatory guidance. With several companies already testing vehicles on public roads, that balancing act becomes even more precarious as states step in to fill what they fear is a growing federal vacuum.

"Given all the ways in which autonomous vehicles could affect society, it's possible that regulators, lawmakers and the industry collectively may not be able to deal with all these issues before autonomous vehicles become road-worthy," said David Groves, a researcher who co-authored the Rand report.

Groves and Nidhi Kalra developed a model to estimate how many lives would be lost or saved in the coming decades under various scenarios of driverless car deployment.

"How important is it that autonomous vehicles are safe when they're introduced versus how quickly they improve?" Kalra said. "Do we allow them on the road when they're like teenage drivers or do we wait for them to be as good as professional drivers?"

They present two of some 500 scenarios in their paper, along with a template that lets readers enter their own variables and build their own model.

In one scenario, autonomous cars are introduced in 2020, when they are 10 percent safer than the average human driver. With steady improvement over 15 years - the result of widespread use by the public - they reach the point of being 90 percent safer than humans. By 2070, the Groves-Kalra report estimates, autonomous cars will have saved 1.1 million lives.

In another scenario, the introduction of driverless cars is delayed until 2040 because regulations require them to be close to perfect. By 2070, autonomous cars will have saved 580,000 lives, 520,000 fewer than under the first scenario.
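The bookkeeping behind such scenarios can be illustrated with a short, hypothetical sketch in Python. This is not the Rand model: the baseline fatality count is the rough figure cited above, but the adoption curve, the improvement schedule and the function `lives_saved` are assumptions invented here for illustration, so the totals will not reproduce the report's figures.

```python
# Illustrative sketch only -- not the Rand model. Baseline deaths come from
# the National Safety Council figure cited in the article; the linear fleet
# adoption and linear safety improvement are assumptions made for this example.

BASELINE_DEATHS_PER_YEAR = 40_000  # approximate annual U.S. crash deaths

def lives_saved(intro_year, initial_safety_gain, mature_safety_gain,
                years_to_mature, end_year=2070, adoption_years=20):
    """Cumulative lives saved from intro_year through end_year.

    safety gain = fraction of fatal crashes an autonomous car avoids
    relative to a human driver; fleet share ramps linearly to 100%
    over `adoption_years` (an assumption, not from the report).
    """
    total = 0.0
    for year in range(intro_year, end_year + 1):
        t = year - intro_year
        # safety improves linearly from the initial to the mature level
        frac = min(t / years_to_mature, 1.0)
        safety_gain = initial_safety_gain + frac * (mature_safety_gain - initial_safety_gain)
        # fleet share of autonomous cars grows linearly
        fleet_share = min(t / adoption_years, 1.0)
        total += BASELINE_DEATHS_PER_YEAR * fleet_share * safety_gain
    return total

# Scenario A: on the road in 2020 at 10 percent safer, 90 percent safer after 15 years
early = lives_saved(2020, 0.10, 0.90, years_to_mature=15)

# Scenario B: held back until 2040 and introduced nearly perfect
late = lives_saved(2040, 0.99, 0.99, years_to_mature=1)

print(f"Early, imperfect deployment: ~{early:,.0f} lives saved by 2070")
print(f"Late, near-perfect deployment: ~{late:,.0f} lives saved by 2070")
```

Whatever assumptions are plugged in, the structure makes the trade-off visible: every year of delay forfeits the lives an imperfect-but-better-than-human fleet would already have been saving.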

"Waiting for the cars to perform flawlessly is a clear example of the perfect being the enemy of the good," Kalra said.

One measure of the degree to which autonomous cars might help reduce crashes comes from NHTSA's traffic fatality data, which shows:

--10,497 annual deaths attributed to drunk driving.

--10,111 deaths from speeding.

--3,450 deaths from distracted driving.

--803 deaths from dozing behind the wheel.

"We've all heard the argument that autonomous vehicles are never drunk, distracted or tired, so they could reduce the huge number of crashes involving these factors," Groves said.

As Reimer sees it, "If the goal is to be able to say 'We need to be able to justify these vehicles are 25 percent safer than legacy systems today before they are market viable at any number greater than X,' hey, that's realistic. But how are we going to prove it?

"The likelihood of seeing a crash with an automated vehicle is very, very high. The likelihood of someone dying at the expense of a highly automated vehicle is for certain," Reimer said.

When that moment arises, the question of how safe is safe enough will be addressed once more, this time underscored by a tragedy.

"A major backlash against a crash caused by even a relatively safe autonomous vehicle could grind the industry to a halt," Kalra said, "resulting in potentially the greatest loss of life over time. The right answer is probably somewhere in between introducing cars that are just better than average and waiting for them to be nearly perfect."
