This is a Guest Post and does not necessarily reflect the thoughts and opinions of Techzim. We have a strong filtering process of what makes it to our blog and are confident that you’ll enjoy the article below.
I’m pretty sure you have seen or read a jillion articles with titles similar to this one. So what makes this one different? I know what I’m talking about! Not only that, this article will give you direction if you decide to start learning AI; and if you want links to tutorials, we have them on our Harare School Of AI GitHub repository, and for those who weren’t there, I’ll drop the links below the article.
So for those who don’t know what AI is or what it does, let me explain. I’m pretty sure you have watched the movie Iron Man. What does that have to do with anything? Well, if you have, then you know Tony Stark (played by Robert Downey Jr.) had a virtual assistant called Jarvis. Basically, that’s where AI is headed (we’re close, but not there yet). AI is just human beings trying to teach computers how to think either like them (despite human beings not knowing how they themselves think) or better than them. Jarvis is a collection of fields in AI, which I will explain in a jiffy, and I will be using it as a reference. And if by some unfortunate event or circumstance you were unable to watch Iron Man: Jarvis is basically a virtual personal assistant that can do anything from setting up a meeting to remotely piloting your weaponized robot suit (if you have one!).
Okay, I mentioned earlier that there are various fields in AI (or maybe I implied it?). With reference to our Jarvis, let me list them with their applications. In order for Jarvis to understand what Tony Stark says, it uses Natural Language Processing (NLP for short). Natural Language Processing is when computers learn to understand human language, either as speech or as written text. NLP is a very interesting branch of AI (it’s the field I’m studying). In this age of chatbots and personal assistants, if you can’t make your own, it would be funny (not really). Why? Because almost everyone is doing it: Google has Google Assistant, Apple has Siri, Amazon has Alexa, Microsoft has Cortana, and so on. So if you want to get started learning Natural Language Processing, head over to our Harare School Of AI GitHub repository and look for the tutorial on NLP; and if it’s too complicated, we can always sit down and discuss how to make it simpler.
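To make that concrete, here is a tiny, hand-rolled sketch of the idea behind an assistant understanding a command. Real NLP systems are far more sophisticated than this; the intents and keywords below are made up purely for illustration:

```python
# A toy "Jarvis": match keywords in a sentence to an intent.
# The intents and keywords here are invented for illustration only.

INTENTS = {
    "set_meeting": {"meeting", "schedule", "calendar"},
    "play_music": {"play", "music", "song"},
    "weather": {"weather", "rain", "sunny"},
}

def detect_intent(sentence: str) -> str:
    """Return the intent whose keywords overlap most with the sentence."""
    words = set(sentence.lower().split())
    best_intent, best_score = "unknown", 0
    for intent, keywords in INTENTS.items():
        score = len(words & keywords)  # how many keywords appear
        if score > best_score:
            best_intent, best_score = intent, score
    return best_intent

print(detect_intent("Jarvis, schedule a meeting for Monday"))  # set_meeting
print(detect_intent("Play my favourite song"))                 # play_music
```

Real assistants use machine learning instead of hand-written keyword lists, but the basic job, mapping words to meaning, is the same.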
Jarvis could also identify Tony when he entered his lab, read his emails, and pass him objects whenever he was building his weaponized robot suit. At one point, while testing flight, Tony fell and crashed, and Jarvis grabbed a fire extinguisher (using some extensions, or arms if you can call them that) and sprayed him with it. How did Jarvis know that what it was holding was a fire extinguisher? And how could it tell that Tony was actually Tony? The branch that teaches computers to ‘see’ is called Computer Vision: teaching computers to recognize objects and even to identify people’s faces!

Now, if you go to www.google.com, search for a house and click the section that says Images, you will see pictures with houses in them. Clever, right? That’s AI at work right there; it’s called object recognition. We give the computer a jillion images with houses in them and tell it that these are houses, so that the next time you give it an image without a house, it will say there is no house. Does that sound complicated? Not once you actually start doing it! Facial recognition works the same way: give the computer a jillion pictures of your face, a jillion pictures of your friends’ faces, and so on, telling it who’s who. Then when you give it someone else’s picture, it will say it’s not you. If you have ever taken a selfie with a friend and tried posting it on Facebook, you’ll have been asked whether you want to tag that friend. How did Facebook know the friend was in the selfie? Also, did you know that you could teach your AI to read handwriting, or even generate its own? Interesting. I wonder if it can read doctors’ handwriting.
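Here is a toy sketch of that “give the computer labelled examples” idea. Real computer vision uses thousands of images and neural networks; these tiny 3×5 pixel “images” are invented just to show the core principle of comparing a new image to labelled examples:

```python
# Toy image recognition: classify tiny 3x5 "images" by nearest neighbour.
# The pixel patterns below are invented for illustration only.

# 1 = dark pixel, 0 = light pixel, flattened row by row.
TRAINING_IMAGES = [
    ((1,1,1, 1,0,1, 1,0,1, 1,0,1, 1,1,1), "zero"),
    ((0,1,0, 1,1,0, 0,1,0, 0,1,0, 1,1,1), "one"),
]

def distance(a, b):
    """Count the pixels where two images differ."""
    return sum(p != q for p, q in zip(a, b))

def classify(image):
    """Label an image with the closest training example's label."""
    return min(TRAINING_IMAGES, key=lambda ex: distance(ex[0], image))[1]

# A slightly smudged "one" (one pixel flipped) is still recognised.
smudged_one = (0,1,0, 1,1,0, 0,1,0, 0,1,0, 0,1,1)
print(classify(smudged_one))  # one
```

This is the same idea behind recognising handwritten digits: lots of labelled examples, and a way to measure how similar a new image is to each of them.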
There is another field that most people nowadays associate with mathematics, business and statistics: data analytics! In our introduction to Deep Learning by Dr Panashe Chiurunge, we learnt stock market price prediction using either CNNs (convolutional neural networks) or logistic regression. It turns out you can use AI to predict things, say the price of cooking oil in the next three or four years, or maybe the bond note to US$ exchange rate. Cool, right? This would be part of data analytics.
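As a small taste of what prediction looks like, here is a minimal straight-line (linear regression) fit in plain Python. The yearly cooking oil prices below are invented numbers, not real data:

```python
# Toy price prediction with a straight-line (linear regression) fit.
# The yearly prices below are invented numbers purely for illustration.

years  = [1, 2, 3, 4, 5]            # year 1, year 2, ...
prices = [2.0, 2.4, 2.9, 3.3, 3.8]  # price of cooking oil in US$

n = len(years)
mean_x = sum(years) / n
mean_y = sum(prices) / n

# Least-squares slope and intercept for the line y = slope * x + intercept.
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, prices)) \
        / sum((x - mean_x) ** 2 for x in years)
intercept = mean_y - slope * mean_x

def predict(year):
    """Extrapolate the fitted line to a future year."""
    return slope * year + intercept

print(round(predict(8), 2))  # predicted price three years past the data
```

Real forecasting models are much fancier, but they share this shape: fit a model to past data, then extrapolate it forward.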
So here’s a summary of the steps you should take if you’re really serious about learning AI:
- Learn Python (Computer programming language)
- Choose what you want your AI to do (like recognize cats and dogs, predict the price of cooking oil, or make its own music)
- Select the field your choice belongs to (NLP, computer vision, reinforcement learning, etc.)
- Head over to the Harare School Of AI Github repo or click the links mentioned below somewhere…
- Start copying and pasting code until you know how it works (cough, cough), and when it doesn’t work, ask someone who knows
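Once you have picked a task, the whole workflow can be sketched in a few lines. Here is a deliberately silly end-to-end example, telling cats from dogs by weight alone, with made-up numbers:

```python
# The steps above, end to end, on a made-up toy problem:
# tell cats from dogs using a single feature (weight in kg).
# All the numbers here are invented for illustration.

data = [
    (3.5, "cat"), (4.0, "cat"), (4.2, "cat"),
    (20.0, "dog"), (25.0, "dog"), (30.0, "dog"),
]

# "Training": pick the threshold halfway between the heaviest cat
# and the lightest dog in our labelled examples.
heaviest_cat = max(w for w, label in data if label == "cat")
lightest_dog = min(w for w, label in data if label == "dog")
threshold = (heaviest_cat + lightest_dog) / 2

def classify(weight):
    """Predict a label for an animal the model has never seen."""
    return "dog" if weight > threshold else "cat"

# "Testing" on new animals.
print(classify(3.8))   # cat
print(classify(22.0))  # dog
```

Real projects swap the one-line rule for a proper model and the six made-up animals for real data, but the loop is the same: gather labelled examples, train, then test on examples the model hasn’t seen.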
Now, AI is not limited to the use cases mentioned above. I mean, Google, Nvidia and Uber have cars that drive themselves! You could detect cancer in images, predict the likelihood of insurance claims, teach a computer to speak (text-to-speech), or even monitor agricultural produce from satellite images (cough, cough), and a whole lot of other things. A wise man once said, “A journey of a thousand miles begins with a single step.” If you don’t know the direction of that journey, you could go round in circles (and this happens in AI).