We have been quite relaxed about being rude about AI, and in some cases we are now happy to apologise. We were rude because of the hype: everything now seems to be powered by AI.
AI in customer experience? Well, that is bound to improve the customer experience.
AI in fraud prevention (a staggering $42 billion will be invested)? Happy days, problem solved.
We are rude because there are two breeds of AI, both hyped to the skies, and the success of both relies on ‘good’ data and common sense.
The very fast look-up AI is here and living among us, sifting through massive amounts of data at lightning speed.
The machine learning, neural network AI is only just beginning to show itself, and there are as many questions about how and under what circumstances we should use it as there are mountains of data ready to be sifted.
The primary reason that we are happy to stop being rude about AI is that even the fast look-up kind is beginning to produce game-changing breakthroughs, particularly in medicine.
The reason is that the data sets we (by which I mean extremely clever people in universities) are asking AI to work with are already there, vast and very clean. The other day, there was a report that the fight against cancer had received an enormous boost. Researchers asked an AI system to look at drugs used for other conditions, such as diabetes, and correlate their effect on cancer patients. The results were astonishing, and possible only because never before have we had the resources to investigate such tangential ideas.
In another breakthrough, researchers at MIT asked AI to revisit molecules they had already examined some years ago. It came up with a molecule that is potentially the answer to the superbugs and pathogens we had begun to despair of ever combating.
It is all in the data.
For all the good stories, there are ones that catch the eye for different reasons.
Some months ago a couple of banks made the news because their decisions about credit cards and loans were gender biased. The banks said that was impossible, but the stories were right.
They were right because the data sets the AI systems were given to help make those decisions were gender biased. Historically, men made more financial decisions than women, so their traits and financial histories were a much richer source of data, hence the bias towards men. It does not mean that men are better financial decision makers than women (far from it, let’s face it) but that the data the AI was working on had a bias in it from the start, one that no one spotted until the data had been processed.
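To make the mechanism concrete: the sketch below is a toy illustration, not anything any bank actually runs. All the records and bands in it are invented. It shows how a naive look-up model, trained on history where one group is heavily over-represented, ends up rejecting identical applicants from the under-represented group simply because it has little or no data on them.

```python
from collections import defaultdict

# Hypothetical toy data: historical loan decisions as (gender, income_band, approved).
# Men are heavily over-represented, mirroring the article's point about
# historical financial data. Every record here is invented.
history = [
    ("M", "high", True), ("M", "high", True), ("M", "high", True),
    ("M", "mid", True), ("M", "mid", True), ("M", "mid", False),
    ("M", "mid", True), ("M", "low", False), ("M", "low", False),
    # Sparse records for women: only one income band appears at all.
    ("F", "high", True),
]

# "Train" a naive look-up model: approval rate per (gender, income_band).
counts = defaultdict(lambda: [0, 0])  # key -> [approvals, total]
for gender, band, approved in history:
    counts[(gender, band)][0] += int(approved)
    counts[(gender, band)][1] += 1

def decide(gender, band):
    approvals, total = counts[(gender, band)]
    # With no history at all for this group, the model falls back to rejection.
    return total > 0 and approvals / total > 0.5

# Two identical mid-income applicants, differing only in gender:
print(decide("M", "mid"))  # True  (rich male history supports approval)
print(decide("F", "mid"))  # False (no female history in this band, so denied)
```

The model never looks at gender as a factor; the unfairness comes entirely from what is, and is not, in the data it was handed, which is exactly why the banks could sincerely believe the bias was impossible.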
We are hereby happy to stop being rude about AI but please, oh please, let’s have the real stories and not ever more hype to sift through.