Importance
Normalisation is a technique in ML that rescales the inputs to a network (or the inputs to a hidden layer) so that all features lie in a similar range and carry a comparable scale. Without it, training becomes much harder, because the network would have to cope with values of effectively unbounded range.

Normalising the inputs is practically a must for a neural net: it shrinks the range of values each feature can take. If the inputs are left unnormalised, the parameters have to accommodate an essentially unbounded spread of possible input values, which makes gradient-based training very difficult and can prevent the network from learning at all.
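As a minimal sketch of what this looks like in practice (using NumPy, with a made-up toy input matrix), standardising each feature to zero mean and unit variance puts features of very different scales onto a comparable range:

```python
import numpy as np

# Toy input matrix: 4 samples, 2 features with very different scales
# (a feature in the thousands next to one between 0 and 1).
X = np.array([
    [1200.0, 0.30],
    [ 950.0, 0.10],
    [1800.0, 0.70],
    [1500.0, 0.55],
])

# Standardise each feature (column) to zero mean and unit variance,
# so all inputs end up on a comparable scale before training.
mean = X.mean(axis=0)
std = X.std(axis=0)
X_norm = (X - mean) / std

print(X_norm)
print(X_norm.mean(axis=0), X_norm.std(axis=0))  # ~0 mean, ~1 std per column
```

The same statistics (mean and std computed on the training set) would then be reused to normalise validation and test inputs.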