Concise and Practical AI/ML

Calculus

What Is Involved?

Calculus here means the Newton–Leibniz calculus, notably differentiation. Newton's notation (a prime mark, f′) is compact but harder to visualize, so this book uses Leibniz notation (d/dx) when solving the dynamic-programming formula of backpropagation, because the chain rule is easier to see in it.
The expression df/dx is read as the derivative of f; the process of finding the derivative of a function is called differentiation.

Derivative of a Function

The derivative of a function f is another function t whose value at each point is the rate of change of f at that point. The value of t at x is the slope of the tangent line to f at x.
For example, for f(x) = x², the derivative is t(x) = 2x.
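The tangent-slope idea can be checked numerically. This is a minimal sketch; the helper name derivative_at and the example function are my own, not from the book.

```python
# Approximate the rate of change of f at x with a central difference.
# (derivative_at is a hypothetical helper name for illustration.)
def derivative_at(f, x, h=1e-6):
    return (f(x + h) - f(x - h)) / (2 * h)

# For f(x) = x^2 the derivative is t(x) = 2x,
# so the tangent slope at x = 3 should be about 6.
f = lambda x: x ** 2
slope = derivative_at(f, 3.0)
```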

Basic Calculus: Derivation

Important Concepts

Variables

When differentiating, the function can have one or several variables.

With-Respect-To (WRT)

Differentiating with respect to a variable means treating only that variable as changing; the other variables are held constant, and any term that does not contain the chosen variable differentiates to zero.
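The same idea numerically: only the chosen variable moves, the rest stay fixed. A minimal sketch; the helper name partial_wrt_x and the example function g are my own.

```python
# Partial derivative with respect to x: y is frozen while x varies,
# so the 3*y term contributes nothing to the rate of change.
def partial_wrt_x(f, x, y, h=1e-6):
    return (f(x + h, y) - f(x - h, y)) / (2 * h)

g = lambda x, y: x ** 2 + 3 * y   # dg/dx = 2x; the 3y term drops out
```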

Reference

Core Derivatives

Derivative of a constant:
d/dx (c) = 0
Derivative of a line:
d/dx (ax + b) = a
Derivative of a polynomial:
d/dx (xⁿ) = n·xⁿ⁻¹
Derivative of an exponential:
d/dx (eˣ) = eˣ
d/dx (aˣ) = aˣ·ln(a)
There are more core derivatives, but they are rarely used in machine learning.
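These core derivatives can be verified numerically against a finite difference. A quick sketch; the helper name diff and the check values are my own.

```python
import math

# Central-difference approximation of the derivative of f at x.
def diff(f, x, h=1e-6):
    return (f(x + h) - f(x - h)) / (2 * h)

# (function, its known derivative) pairs from the table above.
checks = [
    (lambda x: 5.0,       lambda x: 0.0),          # constant
    (lambda x: 3 * x + 1, lambda x: 3.0),          # line
    (lambda x: x ** 3,    lambda x: 3 * x ** 2),   # polynomial
    (math.exp,            math.exp),               # exponential e^x
]
for f, expected in checks:
    assert abs(diff(f, 1.5) - expected(1.5)) < 1e-4
```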

Common Derivative Rules

Sum Rule
d/dx (f + g) = df/dx + dg/dx
Difference Rule
d/dx (f − g) = df/dx − dg/dx
Product Rule
d/dx (f·g) = (df/dx)·g + f·(dg/dx)
Quotient Rule
d/dx (f/g) = ((df/dx)·g − f·(dg/dx)) / g²
Chain Rule (used when changing the variable the derivative is taken with respect to)
dy/dx = (dy/du)·(du/dx)
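The chain rule can be worked through concretely, which is exactly how backpropagation multiplies local derivatives layer by layer. A small sketch; the example function and variable names are my own.

```python
import math

# Chain rule: y = (sin x)^2 with the substitution u = sin x, so y = u^2.
# dy/dx = dy/du * du/dx = 2u * cos x = 2*sin(x)*cos(x), i.e. sin(2x).
x = 0.5
u = math.sin(x)
dy_du = 2 * u            # derivative of u^2 with respect to u
du_dx = math.cos(x)      # derivative of sin x with respect to x
dy_dx = dy_du * du_dx    # multiply the local derivatives together
```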
 