Concise and Practical AI/ML

What are AI and ML

The Formula

AI = Instincts + ML
Everything starts with data: known data encoded as instincts, and learnt data acquired through machine learning. The action taken by an AI is the combination of the actions driven by its instincts and the actions driven by its ML model.

Instincts

These instincts are essentially known data and known algorithms, pre-coded into the system.

Genetic Programming

Genetic programming can play a big part in forming these instincts.
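As a toy illustration of the idea, here is a minimal genetic-programming sketch: it evolves arithmetic expression trees to approximate a target function. The target function, constants, population size, and mutation scheme are all illustrative assumptions, not a production setup.

```python
# A toy genetic-programming sketch (all names, targets, and parameters are
# illustrative): evolve an arithmetic expression tree over {x, small
# constants, +, *} to approximate the target function x*x + 1.
import random

random.seed(1)
OPS = ['+', '*']

def random_tree(depth=3):
    # A tree is either a leaf ('x' or a small constant) or (op, left, right).
    if depth == 0 or random.random() < 0.3:
        return random.choice(['x', random.randint(0, 2)])
    return (random.choice(OPS), random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    if tree == 'x':
        return x
    if isinstance(tree, int):
        return tree
    op, left, right = tree
    a, b = evaluate(left, x), evaluate(right, x)
    val = a + b if op == '+' else a * b
    return max(-10**6, min(10**6, val))   # clamp to keep numbers manageable

def fitness(tree):
    # Lower is better: squared error against the target on a few sample points.
    return sum((evaluate(tree, x) - (x * x + 1)) ** 2 for x in range(-3, 4))

def mutate(tree):
    # Replace a random subtree with a freshly generated one.
    if not isinstance(tree, tuple) or random.random() < 0.3:
        return random_tree(2)
    op, left, right = tree
    if random.random() < 0.5:
        return (op, mutate(left), right)
    return (op, left, mutate(right))

pop = [random_tree() for _ in range(50)]
for _ in range(100):                       # generations
    pop.sort(key=fitness)
    survivors = pop[:10]                   # keep the fittest (truncation selection)
    pop = survivors + [mutate(random.choice(survivors)) for _ in range(40)]

best = min(pop, key=fitness)
```

The evolved program (the fittest tree) can then be baked into a system as a pre-coded "instinct": it runs without any further learning.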

Machine Learning

ML revolves around a model. The model is fed with data and learns how to produce the output.
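One of the simplest possible models makes the "fed with data, learns the output" idea concrete: a 1-nearest-neighbour model, where "training" is just storing the data. The data and labels below are illustrative.

```python
# A minimal "model" sketch (illustrative data): a 1-nearest-neighbour model.
# "Training" is just storing the examples; prediction returns the label of
# the closest stored example.

train = [(1.0, 'small'), (2.0, 'small'), (10.0, 'big'), (12.0, 'big')]

def predict(x):
    # Find the training example whose input is closest to x.
    closest = min(train, key=lambda pair: abs(pair[0] - x))
    return closest[1]

print(predict(3.0))   # -> small
print(predict(11.0))  # -> big
```

Real models (neural networks, boosted trees, ...) replace "store the data" with a training procedure, but the interface is the same: data in, learned predictions out.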

Learning Time

Machine learning is time-consuming: models take a long time to learn (train), but a trained model often performs better than a complex hand-written algorithm at inference time.

Machine Learning Models

Single Model

A model consisting of a single network on its own.

Boosted Model

A boosted model combines multiple weak models of the same kind, each one trained to correct the errors left by the models before it.
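A minimal sketch of the idea, in the style of gradient boosting: the weak models are one-split "stumps", and each new stump is fitted to the residual errors of the ensemble so far. The data, number of rounds, and learning rate are illustrative.

```python
# A pure-Python boosting sketch (illustrative data and hyperparameters):
# each weak learner is a "stump" predicting one of two means, depending on
# whether x is below or above a threshold.

def fit_stump(xs, residuals):
    # Try every threshold and keep the split with the lowest squared error.
    best = None
    for t in sorted(xs):
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = sum((r - lm) ** 2 for r in left) + sum((r - rm) ** 2 for r in right)
        if best is None or err < best[0]:
            best = (err, t, lm, rm)
    _, t, lm, rm = best
    return lambda x: lm if x <= t else rm

def boost(xs, ys, rounds=20, lr=0.5):
    stumps = []
    preds = [0.0] * len(xs)
    for _ in range(rounds):
        # Each new weak model is fitted to the current residual errors.
        residuals = [y - p for y, p in zip(ys, preds)]
        stump = fit_stump(xs, residuals)
        stumps.append(stump)
        preds = [p + lr * stump(x) for p, x in zip(preds, xs)]
    return lambda x: sum(lr * s(x) for s in stumps)

xs = [1, 2, 3, 4, 5]
ys = [1, 2, 3, 4, 5]          # toy target: the identity function
model = boost(xs, ys)
```

Each stump alone is a very weak predictor, but the sum of many stumps fitted to residuals approximates the target closely.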

Combinatory Model

A combinatory model (an ensemble) combines multiple models of different kinds, for example by voting on or averaging their outputs.
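A minimal majority-voting sketch: the three "models" below are illustrative stand-ins for real trained models of different kinds.

```python
# A minimal combinatory-model sketch (the rules are illustrative stand-ins
# for real models of different kinds): three classifiers vote on a binary
# label, and the majority wins.

def rule_threshold(x):       # "model" 1: a threshold rule
    return 1 if x > 5 else 0

def rule_parity(x):          # "model" 2: a rule of a different kind
    return 1 if x % 2 == 0 else 0

def rule_constant(x):        # "model" 3: a constant baseline
    return 1

def vote(models, x):
    votes = [m(x) for m in models]
    return 1 if sum(votes) > len(votes) / 2 else 0

models = [rule_threshold, rule_parity, rule_constant]
print(vote(models, 8))   # threshold: 1, parity: 1, constant: 1 -> majority 1
print(vote(models, 3))   # threshold: 0, parity: 0, constant: 1 -> majority 0
```

Because the member models make different kinds of mistakes, the majority vote is often more robust than any single member.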

History of Machine Learning

Some names to mention: Newton and Leibniz (calculus), Cauchy (gradient descent), Rosenblatt (perceptron/neuron), disputed attribution (backpropagation, see below), Linnainmaa (automatic differentiation), Schmidhuber (LSTM), LeCun (convolutional networks), Bellman (the equation behind Q-learning), Watkins (Q-learning).

The Fathers of Calculus

No doubt, as everybody learns in school, these are Newton and Leibniz.

The Father of Gradient Descent

Cauchy, a French mathematician, devised the gradient descent method for finding a local minimum of a function.
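The method can be sketched in a few lines. Here it finds the minimum of f(x) = (x - 3)²; the starting point, learning rate, and iteration count are illustrative choices.

```python
# A minimal gradient-descent sketch: find the local minimum of
# f(x) = (x - 3)^2 by repeatedly stepping against the gradient.

def grad(x):
    return 2 * (x - 3)       # derivative of (x - 3)^2

x = 0.0                      # starting point
lr = 0.1                     # learning rate (step size)
for _ in range(100):
    x -= lr * grad(x)        # step in the direction of steepest descent

print(round(x, 6))  # -> 3.0 (the minimum)
```

The same update rule, applied to every weight of a network with the loss function as f, is what trains neural networks today.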

The Father of Perceptron and Neuron

The perceptron, the basic building block of the neural network, was invented by Frank Rosenblatt in January 1957 at Cornell Aeronautical Laboratory, Inc. in Buffalo, New York.
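A sketch of the classic perceptron learning rule, here learning the logical AND function; the learning rate, initial weights, and epoch count are illustrative.

```python
# A minimal perceptron sketch (Rosenblatt-style learning rule) that learns
# the AND function. Hyperparameters are illustrative.

data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w = [0.0, 0.0]
b = 0.0
lr = 0.1

def predict(x):
    s = w[0] * x[0] + w[1] * x[1] + b    # weighted sum
    return 1 if s > 0 else 0             # step activation

for _ in range(20):                      # a few passes over the data
    for x, target in data:
        error = target - predict(x)      # perceptron update rule:
        w[0] += lr * error * x[0]        # nudge weights toward the target
        w[1] += lr * error * x[1]
        b += lr * error

print([predict(x) for x, _ in data])  # -> [0, 0, 0, 1]
```

A single perceptron can only learn linearly separable functions; stacking them into layers is what gives neural networks their power.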

The Father of Backpropagation

Feedforward (forward propagation) is the preparation step before backpropagation: it uses each layer's values to compute the next layer, with no optimisation involved. The backpropagation part is truly dynamic programming: it reuses the values computed layer by layer during the forward pass while optimising the weight values through weight updates.
Rosenblatt had the idea of using backpropagation for error correction but did not succeed in implementing it. The algorithm was later invented independently by several people, and the attribution remains confusing; see the Wikipedia article on backpropagation.
Seppo Linnainmaa later invented automatic differentiation, which makes backpropagation fully computable by machine, without any manual derivation of the derivatives of the multiple activation functions and the loss function in the network.
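The feedforward-then-backpropagate loop can be sketched on a tiny network with one sigmoid hidden unit and a linear output, trained to map 0 → 0 and 1 → 1. The network size, targets, initial weights, and hyperparameters are all illustrative.

```python
# A minimal backpropagation sketch: one input, one sigmoid hidden unit, one
# linear output, trained on the toy mapping 0 -> 0, 1 -> 1.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

w1, b1 = 0.5, 0.0        # input -> hidden
w2, b2 = 0.5, 0.0        # hidden -> output
lr = 0.1
data = [(0.0, 0.0), (1.0, 1.0)]

for _ in range(20000):
    for x, y in data:
        # feedforward: each layer is computed from the previous one
        h = sigmoid(w1 * x + b1)
        out = w2 * h + b2
        # backpropagation: the chain rule applied layer by layer in reverse,
        # reusing the values computed during the forward pass
        d_out = out - y              # derivative of 0.5 * (out - y)^2
        d_w2, d_b2 = d_out * h, d_out
        d_h = d_out * w2
        d_z1 = d_h * h * (1.0 - h)   # sigmoid derivative
        d_w1, d_b1 = d_z1 * x, d_z1
        # weight updates: gradient descent on every parameter
        w2 -= lr * d_w2; b2 -= lr * d_b2
        w1 -= lr * d_w1; b1 -= lr * d_b1
```

Here the derivatives are written out by hand; automatic differentiation is what lets frameworks derive these same expressions mechanically for any network.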

The Fathers of Q-learning

Richard Bellman and Chris Watkins laid the foundations of Q-learning and reinforcement learning: Bellman with the Bellman equation, and Watkins with the Q-learning algorithm itself.
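A minimal Q-learning sketch on a one-dimensional chain of five states; the environment, reward, and hyperparameters are illustrative. The update inside the loop is a sampled version of the Bellman equation.

```python
# A minimal Q-learning sketch: the agent starts in state 0 on a 5-state
# chain and receives reward 1 for reaching state 4.
import random

random.seed(0)
n_states = 5
actions = [1, -1]                        # move right or left
Q = {(s, a): 0.0 for s in range(n_states) for a in actions}
alpha, gamma, eps = 0.5, 0.9, 0.2        # learning rate, discount, exploration

for _ in range(500):                     # episodes
    s = 0
    while s != 4:
        # epsilon-greedy action selection
        if random.random() < eps:
            a = random.choice(actions)
        else:
            a = max(actions, key=lambda act: Q[(s, act)])
        s2 = min(max(s + a, 0), n_states - 1)   # clamp to the chain
        reward = 1.0 if s2 == 4 else 0.0
        # Q-learning update: a sampled Bellman backup
        best_next = max(Q[(s2, act)] for act in actions)
        Q[(s, a)] += alpha * (reward + gamma * best_next - Q[(s, a)])
        s = s2
```

After training, the greedy action in every non-terminal state is "move right", and the Q-values shrink by the discount factor with distance from the reward.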

Later Inventors

LSTM Cell

LSTM cells were invented by Jürgen Schmidhuber, a German computer scientist, together with his student Sepp Hochreiter.

Convolutional Cell

Convolutional cells were invented by Yann LeCun, a French computer scientist.
