Concise and Practical AI/ML

Activation Functions

Activation functions limit (squash) a neuron's output values. Without this limiting, outputs can saturate into ever larger ranges, which makes the network hard to train; this is also why the identity activation performs badly: it limits nothing.

Identity Activation

The identity activation function is

f(x) = x

and it neither changes nor limits the value passed in by the nucleus (the dot product, for example).

Unit-step Activation

Unit-step

f(x) = 1 if x >= 0, 0 otherwise

Half-maximum Unit-step

f(x) = 1 if x > 0, 0.5 if x = 0, 0 otherwise
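The two step functions above can be sketched as follows (a minimal NumPy sketch; the function names are my own, not from any library):

```python
import numpy as np

def unit_step(x):
    # 1 for x >= 0, 0 otherwise.
    return np.where(x >= 0, 1.0, 0.0)

def half_maximum_unit_step(x):
    # Same as unit_step, but takes the value 0.5 exactly at x = 0.
    return np.where(x > 0, 1.0, np.where(x == 0, 0.5, 0.0))
```

Both accept scalars or arrays, so they can be applied elementwise to a whole layer's outputs at once.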

Rectifier Activations

A rectifier activation function usually has a flat section and a rising (rectified) linear section in its graph.

ReLU

The Rectified Linear Unit, f(x) = max(0, x), is the most common activation function, and it is faster to compute than sigmoid-like functions. It limits by throwing the negative half of the nucleus output away.

Leaky ReLU

Leaky ReLU is a ReLU variant that is not flat on the negative side; instead it rises with a small slope there (for example f(x) = 0.01x for x < 0), so negative inputs are not zeroed out entirely, avoiding vanishing values caused by multiplications with zeros.
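A minimal NumPy sketch of both rectifiers (the slope value 0.01 is the commonly used default, but it is a tunable hyperparameter):

```python
import numpy as np

def relu(x):
    # Zeroes out the negative half of the input: f(x) = max(0, x).
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Keeps a small slope alpha on the negative side instead of a flat zero.
    return np.where(x > 0, x, alpha * x)
```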

Sigmoid-like Activations

Sigmoid

Sigmoid is the most common S-shaped activation function.

Logistic

The logistic function, f(x) = 1 / (1 + e^(-x)), squashes any real input into the range (0, 1); in neural networks, "sigmoid" usually refers to this function.

Softmax

Softmax generalises the logistic function to multiple classes: it turns a vector of scores into a probability distribution, and in the two-class case it reduces to the logistic function.

Hyperbolic Tangent

The hyperbolic tangent, tanh(x) = (e^x - e^(-x)) / (e^x + e^(-x)), is S-shaped like the sigmoid but zero-centred, with range (-1, 1).
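The sigmoid-family functions can be sketched as follows (a NumPy sketch; the max-subtraction in softmax is the standard numerical-stability trick):

```python
import numpy as np

def sigmoid(x):
    # Logistic sigmoid: squashes any real input into (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def softmax(z):
    # Subtracting the max before exponentiating avoids overflow
    # without changing the result.
    e = np.exp(z - np.max(z))
    return e / e.sum()

def tanh(x):
    # S-shaped like sigmoid, but zero-centred with range (-1, 1).
    return np.tanh(x)
```

As a check on the relationship stated above: for two classes with scores [z, 0], softmax gives e^z / (e^z + 1) = 1 / (1 + e^(-z)) for the first class, which is exactly sigmoid(z).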

 