Normalization

Importance

Normalization is a technique in ML that scales a network's input values (or the values fed into a hidden layer) so they lie in a similar, bounded range and carry comparable meaning. It is effectively required for a network to learn: if inputs are left unnormalized, the parameters would have to fit an unbounded range of possible input values, which makes training impractical.
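A minimal sketch of what this looks like for inputs, assuming plain z-score (zero mean, unit variance) normalization with NumPy; the function name and toy data are just for illustration:

```python
import numpy as np

def standardize(X, eps=1e-8):
    """Scale each feature column to zero mean and unit variance."""
    mean = X.mean(axis=0)
    std = X.std(axis=0)
    return (X - mean) / (std + eps)  # eps avoids division by zero

# Toy example: two features on wildly different scales
X = np.array([
    [1000.0, 0.001],
    [2000.0, 0.002],
    [3000.0, 0.003],
])

X_norm = standardize(X)
print(X_norm)               # both columns now span a comparable range
print(X_norm.mean(axis=0))  # ~0 per feature
print(X_norm.std(axis=0))   # ~1 per feature
```

The same idea applied to the values entering a hidden layer is what batch normalization and layer normalization do inside the network, with the statistics computed on the fly during training.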

