Concise and Practical AI/ML
Neuron

Input Separations

The separation of inputs (deciding which class an input belongs to) is done by the nucleus (e.g. the dot product in a basic neuron). The activation function doesn't do the separation; it limits the output range so that a broad range of outputs doesn't get saturated. That is why the identity activation function doesn't work well, while ReLU or sigmoid work well.
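Below is a minimal sketch of this split between nucleus and activation. The function names (nucleus, sigmoid, neuron) and the example weights are illustrative assumptions, not a fixed API:

```python
import math

def nucleus(weights, inputs, bias):
    # The dot product does the actual separation: its sign tells
    # which side of the decision boundary the input lies on.
    return sum(w * x for w, x in zip(weights, inputs)) + bias

def sigmoid(z):
    # The activation only limits the output range (here to (0, 1));
    # it does not separate anything.
    return 1.0 / (1.0 + math.exp(-z))

def neuron(weights, inputs, bias):
    return sigmoid(nucleus(weights, inputs, bias))

print(neuron([1.0, -2.0], [0.5, 0.25], 0.1))  # ~0.525
```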

Linear Separation

A neuron with 2 weights separates its 2-D input space with a line.
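As a sketch, with hand-picked weights w1, w2 and bias b (illustrative values, not learned ones), the neuron classifies a 2-D point by the sign of w1·x1 + w2·x2 + b; the boundary is the line where that sum equals zero:

```python
# Boundary is the line x1 + x2 - 1 = 0 (weights chosen by hand).
w1, w2, b = 1.0, 1.0, -1.0

def side(x1, x2):
    # The sign of the weighted sum tells which side of the line
    # the point falls on.
    return 1 if w1 * x1 + w2 * x2 + b > 0 else 0

print(side(0.9, 0.8))  # 1: above the line
print(side(0.1, 0.2))  # 0: below the line
```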

Planar Separation

A neuron with 3 weights separates its 3-D input space with a plane.

Hyperplanar Separation

A neuron with 4 or more weights separates its input space with a hyperplane.
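The same sketch generalizes to any number of weights; the boundary is the hyperplane w·x + b = 0. The weights and test point below are assumed example values:

```python
def separates(weights, bias, point):
    # Which side of the hyperplane w·x + b = 0 the point falls on.
    z = sum(w * x for w, x in zip(weights, point)) + bias
    return z > 0

# 4 weights -> a hyperplane in 4-D input space.
print(separates([0.5, -1.0, 2.0, 0.25], -0.5, [1.0, 0.0, 0.5, 2.0]))  # True
```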

Bias Weight

Bias is an additional weight added to every neuron. It is essential for a neuron to do separation: without bias, the separation line/plane/hyperplane passes through the origin of the input space (the point where every input value is zero), and in almost all cases it cannot separate the inputs.
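A concrete sketch of why: without a bias, a single neuron cannot compute logical AND on {0,1} inputs, because the boundary line is stuck at the origin. The weights below are hand-picked for illustration:

```python
def fires(w1, w2, b, x1, x2):
    return w1 * x1 + w2 * x2 + b > 0

# With bias: w1 = w2 = 1, b = -1.5 puts the line between (1,1) and the rest.
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print((x1, x2), fires(1.0, 1.0, -1.5, x1, x2))  # only (1, 1) fires

# Without bias (b = 0), firing on (1,1) needs w1 + w2 > 0, while staying
# silent on (1,0) and (0,1) needs w1 <= 0 and w2 <= 0 -- impossible.
```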
