Engineering Techniques

Model Expansion

Expand Width

Division for Dense Neuron

Duplicate a neuron k times and divide its outgoing weights by k. Each replica computes the same activation as the original, so the next layer's dot product stays the same and backprop still works.
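A minimal numpy sketch of this division trick, assuming a small dense net with a ReLU hidden layer (the sizes and the `widen` helper are illustrative, not from the source):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dense net: layer l maps 4 -> 3, layer l+1 maps 3 -> 2.
W1, b1 = rng.normal(size=(3, 4)), rng.normal(size=3)
W2, b2 = rng.normal(size=(2, 3)), rng.normal(size=2)

def forward(x, W1, b1, W2, b2):
    h = np.maximum(0, W1 @ x + b1)   # ReLU hidden layer
    return W2 @ h + b2

def widen(W1, b1, W2, neuron, k):
    """Duplicate `neuron` k times; divide its outgoing weights by k."""
    # Replicas get the same incoming weights and bias -> same activation.
    W1w = np.vstack([W1] + [W1[neuron:neuron + 1]] * (k - 1))
    b1w = np.concatenate([b1, np.repeat(b1[neuron], k - 1)])
    # Split the outgoing weight column evenly across the k copies.
    col = W2[:, neuron:neuron + 1] / k
    W2w = np.hstack([W2, np.tile(col, (1, k - 1))])
    W2w[:, neuron] = col[:, 0]
    return W1w, b1w, W2w

x = rng.normal(size=4)
W1w, b1w, W2w = widen(W1, b1, W2, neuron=1, k=3)
# The next layer's dot product is unchanged, so the output is identical.
assert np.allclose(forward(x, W1, b1, W2, b2),
                   forward(x, W1w, b1w, W2w, b2))
```

The assert holds exactly: the k replicas each contribute 1/k of the original outgoing weight, so their contributions sum back to the original pre-activation.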

Expand Depth

Residual Layer

Insert a residual layer with all-zero params (w, b): the skip connection keeps the forward pass exactly as it was, while the new branch creates another path for the signal.
Zero params with (almost) any activation function → forward pass unchanged, since act(0) = 0 for ReLU, tanh, etc. (sigmoid is an exception).
Gradients flow backward fine too: the skip path has derivative 1.
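A minimal numpy sketch of a zero-initialized residual block (the shape and the tanh choice are illustrative assumptions, not from the source):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4
W, b = np.zeros((d, d)), np.zeros(d)   # newly inserted layer: all-zero params

def residual_block(x, W, b, act=np.tanh):
    # Skip connection plus the new branch: x + act(Wx + b).
    return x + act(W @ x + b)

x = rng.normal(size=d)
# With w = b = 0 and act(0) = 0, the block is exactly the identity,
# so inserting it leaves the existing forward pass intact.
assert np.allclose(residual_block(x, W, b), x)
```

Since the skip path has derivative 1, upstream gradients pass through untouched; and because tanh'(0) = 1, the zero branch's weights receive a nonzero gradient (proportional to x) from the very first step, so the new layer is trainable immediately.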

Model Shrinking

Expanding vs. shrinking is like workforce: adding more workers to do the same job is easy, but fewer workers usually can't do the same job, which is why shrinking is hard.
Model shrinking is harder than expanding, and is usually only an approximation, not exact. The operations include:
Reduce neurons in a layer: merge (near-)duplicate neurons and update the next layer's weights accordingly (see the sketch after this list). Only valid for identity or ReLU layers.
Reduce a layer:
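A minimal numpy sketch of neuron merging, assuming the toy case where two hidden neurons are exact duplicates, so the merge is exact; with merely similar neurons it is only an approximation. The `merge` helper and sizes are illustrative, not from the source:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy net whose hidden neurons 1 and 2 are exact duplicates.
W1, b1 = rng.normal(size=(3, 4)), rng.normal(size=3)
W1[2], b1[2] = W1[1], b1[1]
W2, b2 = rng.normal(size=(2, 3)), rng.normal(size=2)

def forward(x, W1, b1, W2, b2):
    h = np.maximum(0, W1 @ x + b1)   # ReLU hidden layer
    return W2 @ h + b2

def merge(W1, b1, W2, keep, drop):
    """Remove neuron `drop`, folding its outgoing weights into `keep`."""
    W2m = W2.copy()
    W2m[:, keep] += W2m[:, drop]     # next layer absorbs the merged neuron
    return (np.delete(W1, drop, axis=0),
            np.delete(b1, drop),
            np.delete(W2m, drop, axis=1))

x = rng.normal(size=4)
W1s, b1s, W2s = merge(W1, b1, W2, keep=1, drop=2)
# Exact for duplicates; only approximate when neurons are merely similar.
assert np.allclose(forward(x, W1, b1, W2, b2),
                   forward(x, W1s, b1s, W2s, b2))
```

Exact-duplicate merging like this works under any activation; the identity/ReLU restriction in the text likely reflects ReLU's positive homogeneity, ReLU(c·z) = c·ReLU(z) for c > 0, which additionally allows merging neurons whose incoming weights are positive multiples of one another.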

 