Drones are great. They can whizz above the traffic, delivering our mail, groceries and other vital supplies within minutes. They can keep us safe through unobtrusive surveillance, and they can bring broadband to places that cables cannot reach.
Drones can also be subverted. Add AI (the real kind) to drones and you open up a potential escalation in ways of killing people and blowing things up, without the nastiness of having to be there.
Drones, therefore, need to be added to the list of technologies (such as 5G, AI and VR) that will be fully ready for action in the five-to-ten year window, not the one-to-three year window.
Drones crash into things. Although the video of a drone clipping the wing of an SWA airplane has now been exposed as a fake, discussions are ongoing about how close they should be allowed to fly to airports, celebrities and other monuments of national importance.
For one thing, there are far too many of them, and they don’t have roads to stick to – yet. So regulators are now involved, and nothing flies without everyone being as happy as they can be that airspace is safe.
And some of them do crash. This one in Russia crashed into a wall during its debut delivery. The prevailing theory is that the crash was caused by radio interference from too many local Wi-Fi signals.
Meanwhile, some respected observers of our world believe that drones are the most immediate danger to mankind. As we noted in our last Friday Futures, Michio Kaku believes that drones – or automatic killing machines – are our current biggest threat. Once a drone has been given permission to kill one human being, he asks, what happens when it sees the next human being and concludes that it has permission to kill them, too? And the next. And so on.
A little far-fetched, perhaps?
Maybe so, but history is littered with humans giving mistaken instructions (whether to other human beings or to machines), with horrifying consequences. And if a drone can be discombobulated by Wi-Fi, it sure as heck can be hacked.
So, reluctantly, while regulators try to make airspace safe, while we work out how to tell drones not to kill every human being they meet, and while the Russians figure out how to avoid walls, we have to put drones in the pile of technologies whose advancement needs to be paused until we are sure they are safe.
And actually work.