Let’s not pretend self-driving car risks are unique to Uber

[Image: self-driving car error. Credit: Sangoiri / Shutterstock.com]

The safety record for self-driving cars is actually better than the Uber accident makes it look, but it’s risky for rival firms to state it could never happen to them.

ITEM: We don’t yet know what went wrong with the Uber self-driving car that struck and killed a pedestrian in Tempe, Arizona. But that hasn’t stopped rival autonomous-vehicle tech firms from claiming the accident would not have happened with their technology.

The accident is still under investigation, but Waymo CEO John Krafcik has stated that whatever happened in Tempe, it wouldn’t have happened with Waymo vehicles, reports Reuters:

“At Waymo, we have a lot of confidence that our technology would be able to handle a situation like that,” Krafcik said, referring to a scenario in which a pedestrian crosses the street at night.

Meanwhile, Amnon Shashua, chief executive officer of Intel-owned Mobileye, was equally adamant that the company’s advanced driver assistance systems (ADAS) technology – which uses cameras and sensors to power safety features like lane keeping and automatic emergency braking, even in regular human-driven cars – would have spotted Elaine Herzberg in time. He even demonstrated this by feeding video footage of the accident into Mobileye’s software.

If you read carefully, you’ll note that both Krafcik and Shashua are qualifying their statements – they’re not claiming their cars would never hit a pedestrian. They’re saying their technology would not have failed in this very specific scenario. It’s an important distinction, because only a fool would stand up and declare their technology 100% safe and an accident impossible.

That said, as a general rule it’s not wise from a PR standpoint to point to the failure of a competitor’s technology and proclaim, “That would never happen with our technology.” Such claims have a way of coming back to haunt you when you’re proven wrong. And it’s not as if Waymo’s self-driving accident record is spotless (though of course no one has been hurt).

To be fair, I’m sure the point of both statements wasn’t just to tout their own technological prowess and commitment to safety over the competition. It was also an attempt to reassure the general public (and, perhaps more importantly, investors, regulators and legislators) that autonomous cars really can be safer than regular cars – provided you do it right. Consequently, we shouldn’t condemn an entire fledgling industry over one tragic fatality involving one company – especially when that company doesn’t have the greatest reputation for playing by the rules, and reportedly cut corners on its lidar sensing technology in 2016, which may turn out to have been a factor in the accident.

Fair point. It’s worth noting that while there have been numerous accidents involving self-driving cars, they’re statistically rare, especially when you compare the incident numbers to actual miles driven (which the state of California tracks in its regular reports on autonomous driving). And in the majority of cases, the accident is the fault of humans – either the ones in the test vehicles, or other drivers on the road.
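To make that per-mile comparison concrete, here’s a minimal sketch of the arithmetic involved – normalising raw incident counts by miles driven. The figures are placeholder values for illustration only, not numbers from the California DMV reports.

```python
# Minimal sketch: normalising incident counts by exposure (miles driven).
# All figures here are hypothetical placeholders, NOT official DMV data.

def incidents_per_million_miles(incidents: int, miles_driven: float) -> float:
    """Return the incident rate per million vehicle miles."""
    return incidents / (miles_driven / 1_000_000)

# Hypothetical example: a test fleet that logged 2 million miles and
# reported 10 collisions of any severity.
rate = incidents_per_million_miles(incidents=10, miles_driven=2_000_000)
print(f"{rate:.1f} incidents per million miles")  # 5.0
```

The point isn’t the numbers themselves – it’s that raw incident counts only mean something once they’re divided by exposure, which is why the per-mile framing matters.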

Even so, the development of self-driving cars is by necessity a risky business – accidents happen, and it was only a matter of time before someone got hurt or even killed. That’s no reason to give up on autonomous cars and stick with the status quo of dangerous humans at the wheel. But to be brutally honest (and I would love to be proven wrong here), the fatality in Tempe isn’t likely to be the last.

Granted, it’s probably an even worse PR move for Waymo, Mobileye or any other stakeholder to say that than it is to say their technology is totally risk-free. The problem with the latter statement is that it could undermine public trust in self-driving cars even further if it turns out to be wrong.

And it’s not as if there’s no third option. Here’s NuTonomy’s response to the Uber accident:

“We are working with City of Boston officials to ensure that our automated vehicle pilots continue to adhere to high standards of safety,” a NuTonomy statement said. […]

Karl Iagnemma, NuTonomy’s CEO, told the Boston Herald that the response to the Uber accident – in which a woman crossing the street was struck and fatally injured by the self-driving car – will be crucial to the future of the technology.

“The reality is we may work very hard as technology developers and end up with a technology members of the public are uncomfortable with,” Iagnemma said. “If that’s the outcome, then we have failed as an industry, so we have to think very carefully about how we develop, test, and deploy the technology.”

That sounds a lot more reassuring to me than “that wouldn’t happen with our technology”.
