Hands-On Neural Networks and Time Series, with Python
During my Bachelor's Degree, my favorite professor told me this:
Once something works well enough, nobody calls it "AI" anymore
This idea goes in the same direction as Larry Tesler's famous remark that "AI is whatever hasn't been done yet." One of the first examples of Artificial Intelligence was the calculator, which was (and is) able to perform very complex mathematical computations in a fraction of a second, while the same task would take a human being minutes or hours. Nonetheless, when we talk about AI today we don't think about a calculator. We don't think of it because it simply works incredibly well, and we take it for granted. The Google Search algorithm, which is in many ways far more complex than a calculator, is a form of AI that we use in our everyday lives, yet we barely even think of it as AI.
So what really is "AI"? When do we stop defining something as AI? The question is a complex one because, if we really think about it, AI has multiple layers and domains.
It surely has multiple layers of complexity. For example, ChatGPT is more complex than LeCun's "digit recognition" 2D CNN, both conceptually and computationally, while a simple regression algorithm is in turn far less complex than that same CNN (more on this later).
It surely also has multiple domains. Every time we solve a CAPTCHA, we are actively creating input for a Neural Network that processes images. Every time we interact with a GPT we are building the text input for an NLP algorithm. Every time we say "Alexa, turn on the light in the kitchen" we are feeding the audio input for a Neural Network. And while it is true that, at the end of the day, everything is nothing but a 0/1 signal for a computer, it is also true that, in practice, the Neural Network that processes an image has a completely different philosophy, implementation, and complexity than the one that processes audio or text.
This is why companies are increasingly looking for specialized Machine Learning Engineers who know how to handle one particular kind of data better than the others. For example, in my professional career I have worked more with time series than with anything else. This blog post aims to give the reader an idea of how to use Neural Networks in the time series domain, and we will do it at different levels of complexity. We will start from the simplest Neural Network there is, the Feed Forward Neural Network, and work our way up to the fanciest, most modern, and most complex structure: the Transformer.
I don't want to bore any of the readers, and I also know that many of you wouldn't find this article useful without any coding, so we are going to translate everything from English to Python.
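Just to give a flavor of where we are headed, here is a minimal sketch of the simplest case we will cover: a Feed Forward Neural Network that predicts the next value of a series from a window of past values. Treat it as an illustrative preview rather than the implementation we will build later; it assumes PyTorch, a univariate series, and a fixed window of past observations as input, and the class name `FeedForwardForecaster` is just a placeholder.

```python
# Minimal sketch (assumptions: PyTorch, univariate series, window of 10 past values).
import torch
import torch.nn as nn

class FeedForwardForecaster(nn.Module):
    """Maps a window of the last `window` observations to the next value."""
    def __init__(self, window: int = 10, hidden: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(window, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x has shape (batch, window); the output has shape (batch, 1)
        return self.net(x)

# Quick smoke test on random data standing in for a real time series
model = FeedForwardForecaster(window=10)
dummy_windows = torch.randn(8, 10)   # 8 windows of 10 past values each
predictions = model(dummy_windows)   # 8 one-step-ahead predictions
print(predictions.shape)             # torch.Size([8, 1])
```

A few lines like these are already a complete (if naive) forecaster; everything that follows in this post is, in one way or another, a refinement of this idea.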
Let's get started!