Artificial intelligence. Depending on who you ask, it's either the future of technology or the end of mankind. One thing's for sure, though: it's become a buzzword, with virtually anything containing a microchip being described as "Powered by AI". But what is artificial intelligence, really? And how can it benefit you?
The commonly held understanding of AI has changed a bit over the years. We used to consider the ability to read text to be artificial intelligence, but OCR cracked that nut decades ago, and now we just consider it normal computing. A commonly used phrase is that AI is whatever hasn't been done yet - that's even known as the "AI effect". AI is a very wide-ranging field of study.
In general, it's used to describe machines mimicking human cognitive functions such as learning and problem-solving. I can't cover the whole shebang, but most of the noise today is centered around two sub-fields: machine learning and natural language processing.
You may also hear about deep learning. That's roughly machine learning using neural networks, but it's getting too deep for this post. Natural language processing is all about having computers understand and synthesize normal human language and speech. It's not the same as plain speech recognition. We've had speech recognition for a while, but if you used early versions of it you'll know it was rubbish. Not only could it never understand my accent, but you had to speak like a computer to it: specific words in a specific order, otherwise it couldn't understand you. Compare that to the likes of Amazon Echo or Google Home today. Those virtual assistants don't need you to speak computer to them - they understand your normal speech.
A less fancy, but very common, example of this is the chatbot that seems to be popping up on every website's contact page. Chatbots are able to understand you typing in your normal vernacular, which is handy, because no customer is going to learn your chatbot's syntax just to interact with your website.
All of these chatbots use artificial intelligence, but that doesn't necessarily mean that they behave intelligently. Presumably they're there for lazy people who can't be bothered to read, because all they seem to be able to do is read out the same frequently asked questions that I've already seen. If the answer was there, I wouldn't be trying to contact you!
Maybe that's a good example of why focusing on the technology instead of the customer experience isn't such a great idea. But hey, if the machines do rise up, let's hope Skynet turns out to be a chatbot, because that's a fight we can win. We'll have the Terminators recycled into bean tins and Skynet will still be trying to find my parcel! In all seriousness, though: natural language processing is revolutionizing the way we interact with computer systems, as well as with each other.
Just a few years ago, the thought of being able to speak to someone in a different country and have your words translated into their language in real time would have been Star Trek territory; now it's just Skype.
The next buzzword in AI. The other aspect of artificial intelligence that's getting a lot of press at the moment is machine learning. Unlike a normal algorithm, a machine learning system doesn't follow a defined set of instructions. Instead, it builds a mathematical model, and it uses that to make decisions and predictions. In order to do that, it first needs to be fed a whole lot of training data to learn from.
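To make that contrast concrete, here's a deliberately tiny sketch in Python. The spam-filter framing, the numbers, and the method are all invented for illustration: rather than a hand-written rule, the "model" is a single threshold learned from labeled examples.

```python
# Illustrative sketch: instead of hard-coding a rule, we "learn" one from
# labeled training data. Here the entire mathematical model is one number:
# a decision threshold, learned as the midpoint between the two classes'
# average values.

def train(examples):
    """examples: list of (value, label) pairs, label is 'spam' or 'ok'."""
    spam = [v for v, lbl in examples if lbl == "spam"]
    ok = [v for v, lbl in examples if lbl == "ok"]
    return (sum(spam) / len(spam) + sum(ok) / len(ok)) / 2

def predict(threshold, value):
    return "spam" if value > threshold else "ok"

# Made-up training data: number of links in an email vs. whether it was spam.
training = [(9, "spam"), (11, "spam"), (1, "ok"), (3, "ok")]
model = train(training)
print(predict(model, 10))  # prints "spam": a new email with 10 links
```

Nobody told the program "more than 6 links means spam" - that rule emerged from the data, which is the whole point.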
It's going to be easier with an example here, so let's go with one. You're probably familiar with Where's Wally puzzles. If you're American, you'll know him as Where's Waldo.
The character Wally is hidden in a page full of lots of other little characters, and it's your job to find him. For a human that's conceptually quite simple: you see a picture of Wally, you recognize him, and you go and look for him. For a computer it's a lot more difficult, because computers work in numbers. The computer needs to come up with some numbers that identify Wally, and that are distinct enough from every other character on the page, while still recognizing Wally if he's in a different pose or partially obscured by another object.
With machine learning, you would feed the algorithm lots and lots of images of Wally, and lots and lots of images of not-Wally, and it would compare them and build up a mathematical pattern it can use to decide whether a given image is or isn't Wally. That idea of pattern building and recognition is key to how machine learning is actually used.
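Here's a toy version of that idea, assuming we've already reduced each image to a pair of made-up numeric features (say, "amount of red" and "stripe density" - both invented for illustration). The learned "pattern" is just the average feature vector for each class, and a new image is labeled by whichever average it sits closer to:

```python
# Nearest-centroid toy classifier: the "pattern" for each class is the
# average of its example feature vectors; a new image gets whichever
# label's average it is closest to.

def centroid(vectors):
    # Average each feature across all example vectors.
    return [sum(col) / len(vectors) for col in zip(*vectors)]

def distance(a, b):
    # Euclidean distance between two feature vectors.
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

# Hypothetical features per image: [amount of red, stripe density].
wally_examples = [[0.9, 0.8], [0.8, 0.9], [0.85, 0.75]]
not_wally_examples = [[0.2, 0.1], [0.3, 0.2], [0.1, 0.3]]

wally_avg = centroid(wally_examples)
other_avg = centroid(not_wally_examples)

def is_wally(image):
    return distance(image, wally_avg) < distance(image, other_avg)

print(is_wally([0.88, 0.82]))  # prints True: close to the Wally pattern
```

Real image recognition uses far richer features and far more data, but the shape of the idea - compare examples, build a pattern, measure new inputs against it - is the same.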
Why would you bother with this? It seems like a lot of effort, and even after you build up that pattern with all of that training, it's still not 100% accurate, so... why? Well, it might take a while to train the machine up. Once it's been trained, though, it can process data very quickly, and in vast quantities. That's not even hypothetical: somebody did actually build a robot that uses machine learning to solve Where's Wally puzzles, and it can apparently solve any puzzle in about four and a half seconds. If for some reason you had thousands of Where's Wally puzzles to solve, that would be the way to go. It's probably not the most practical example, though, because for anyone who actually enjoys the puzzles it would rather defeat the point.
But what if, instead of looking for Wally, you were looking for something more useful? Maybe a fugitive in CCTV footage, or early signs of cancer in medical data? Or what if, instead of looking for known results, you used machine learning to predict outcomes? Because machine learning works in mathematical patterns, these models can be used to predict outcomes based on inputs.
So let's take the medical example. Say you had a whole bunch of medical data from people who appeared to be healthy and then later went on to develop cancer, and you also had a bunch of medical data from people who appeared to be healthy and stayed healthy.
What if we fed all of that into a machine learning algorithm, and it was able to spot a pattern? It could predict who was likely to develop cancer in the future, based on patterns we didn't even know existed but that it was able to calculate. If we can do that - if we can predict who's going to get cancer - we can detect it sooner, and that increases the survival rate.
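As a sketch of how such a pattern might surface (with entirely made-up markers and records, and a far cruder method than anything real medical research would use), we could score a new record by how much more common its markers were among the people who became ill than among those who stayed healthy:

```python
# Toy risk model: each record is a set of hypothetical markers. The
# "pattern" learned is, per marker, the difference between how often it
# appears in the ill group and in the healthy group. A new record's risk
# score is the sum of its markers' weights.

from collections import Counter

# Invented data: markers seen in people who later developed the disease
# vs. people who stayed healthy.
ill = [{"m1", "m3"}, {"m1", "m2", "m3"}, {"m3"}]
healthy = [{"m2"}, {"m1"}, {"m2", "m4"}]

def marker_weights(ill, healthy):
    ill_counts, ok_counts = Counter(), Counter()
    for record in ill:
        ill_counts.update(record)
    for record in healthy:
        ok_counts.update(record)
    markers = set(ill_counts) | set(ok_counts)
    return {m: ill_counts[m] / len(ill) - ok_counts[m] / len(healthy)
            for m in markers}

weights = marker_weights(ill, healthy)

def risk_score(record):
    return sum(weights.get(m, 0) for m in record)

# A new, apparently healthy record sharing markers with the ill group:
print(round(risk_score({"m1", "m3"}), 2))  # prints 1.33
```

Nobody hand-picked "m3" as a warning sign; its weight fell out of the counting. That's the (heavily simplified) sense in which a model can find patterns we didn't know existed.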
A less dramatic form of machine learning is going on behind YouTube's recommendations. How did you get yours? There are nearly five billion videos viewed on YouTube every single day, so why this video for you? Well, the answer is that YouTube's famous (or maybe infamous) algorithm uses artificial intelligence.
Think about this: how would you measure a good video? Is it the ratio of likes to dislikes? Is it the length of time people tend to stick around and watch? Both of those can be faked. And even if you watched a video all the way to the end, liked it, and then left, from YouTube's perspective that might not be a good thing, because you've left the platform - they've lost a visitor. Maybe there's no perfect answer to this; and if YouTube thought they'd found a perfect answer and settled on it, you can guarantee people would immediately try to game the system. Instead, YouTube designed an artificially intelligent algorithm. Rather than telling it how to measure a good video, they gave it an objective. You can pretty much guarantee that objective is to maximize your time on YouTube, because that maximizes their ad revenue.
How does it do that? It ingests all the data available to it and uses that to build predictive models. So why was this video suggested for you? The truth is, nobody knows. Not even the people who wrote YouTube's algorithm could tell you exactly why this video was chosen, because they didn't write the rules - the algorithm did. All we can say is that the algorithm analyzed this video, other people's reactions to it, and your past history on the platform, and concluded that for some reason this video stood a good chance of sucking you into a black hole of binge-watching on YouTube.
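A drastically simplified sketch of that objective-driven approach, with invented topics and watch times (real recommendation systems are vastly more complex and opaque): nobody hard-codes what a "good video" is; each candidate is scored by predicted watch time, estimated from the viewer's own history, and the recommendation is whatever maximizes that score.

```python
# Toy objective-driven recommender: score each candidate video by a crude
# prediction of watch time (the average minutes this viewer spent on past
# videos sharing at least one topic), then recommend the top scorer.

# Hypothetical viewing history: (topics of the video, minutes watched).
history = [({"gaming", "retro"}, 12.0),
           ({"cooking"}, 2.0),
           ({"retro", "music"}, 9.0)]

def predicted_minutes(topics):
    relevant = [mins for seen, mins in history if seen & topics]
    return sum(relevant) / len(relevant) if relevant else 0.0

candidates = {"speedrun": {"gaming", "retro"},
              "recipe": {"cooking", "baking"}}

# The objective: maximize expected time on the platform.
best = max(candidates, key=lambda name: predicted_minutes(candidates[name]))
print(best)  # prints "speedrun"
```

Notice that no one wrote "recommend gaming videos" anywhere - that behavior emerged from the history and the objective, which is why even the system's authors can struggle to explain any individual recommendation.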