Driverless cars have an Achilles’ heel – the humble road sign


Confusing driverless cars is becoming a sport. Even before the high-profile case of a 35 mph road sign fooling a Tesla, there was real concern that the humble road sign could be a genuine and potentially dangerous Achilles’ heel in the world of driverless cars.

On that occasion, the altered 3 – its middle stroke extended with a small strip of tape – caused a (now out-of-date) system on board the Tesla to read the road sign as 85 miles an hour, not 35, which is obviously very dangerous.

The real problem was that the alteration fooled a deep learning system completely while fooling no humans at all.

There was already work going on in this area, of course. It is, after all, vital that driverless cars read road signs accurately – all the time.

‘This is a nightmare,’ said Tadayoshi Kohno, a professor in the Department of Computer Science and Engineering at the University of Washington, after his team managed to fool a machine learning system into mistaking a ‘STOP’ sign for a 45 mph speed limit sign.

Again, the tampering is very obvious to humans – apparently much less so to machines.
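Both attacks exploit the same weakness: a classifier’s prediction can be flipped by small, carefully chosen changes to its input. As a purely illustrative sketch of the digital version of this idea – the textbook Fast Gradient Sign Method, not the physical tape-and-sticker attacks described above – here is roughly what generating such an ‘adversarial example’ might look like in PyTorch. The `model` argument stands in for a hypothetical pretrained sign classifier:

```python
import torch
import torch.nn.functional as F

def fgsm_perturb(model, image, true_label, epsilon=0.03):
    """Fast Gradient Sign Method (Goodfellow et al., 2015):
    nudge every pixel a tiny step in the direction that most
    increases the classifier's loss on the true label."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), true_label)
    loss.backward()
    # epsilon bounds the per-pixel change: near-invisible to a human,
    # yet often enough to flip the model's prediction entirely.
    adversarial = image + epsilon * image.grad.sign()
    return adversarial.clamp(0.0, 1.0).detach()
```

A perturbation this small leaves the sign looking essentially unchanged to a driver – exactly the asymmetry both research teams demonstrated.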

These examples show just how important it is for the deep learning systems in driverless cars to be reliable 100% of the time.

The problem is that 100% is extremely hard to achieve and the last few percentage points are always the hardest.

Remember when OCR systems were the ‘next big thing’? Every year, the accuracy of the systems would improve. We started with 85% (excited), then 90% (very excited), then 95% if you were lucky and had a following wind. All of which sounded great, and the end of clacking away on an old-fashioned keyboard was in sight. The trouble is that even at 95% accuracy, a scanned document containing 5,000 words would still have around 250 of them wrong. Not so great.
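The arithmetic is worth spelling out. A quick back-of-the-envelope sketch (plain Python, purely illustrative):

```python
# Expected wrong words in a 5,000-word scanned document
# at each of the OCR accuracy levels mentioned above.
words = 5000
for accuracy in (0.85, 0.90, 0.95, 0.99):
    errors = round(words * (1 - accuracy))
    print(f"{accuracy:.0%} accuracy -> roughly {errors} wrong words")
# Output: 85% -> 750, 90% -> 500, 95% -> 250, 99% -> 50
```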

The same goes for the accuracy of the deep learning AI systems in driverless cars. Even 99% accuracy is simply not good enough when people’s physical safety is at stake.

One day we will get there. One day, driverless cars will be commonplace. But before they can hit the mainstream, regulators and safety experts will demand more and more testing. And every time, those last few percentage points of reliability will feel harder and harder to win.

The other problem is that a small group of humans will always want to thwart such systems and play pranks on them.

The danger is that, when it comes to driverless cars, the stakes of those pranks are just too high.

