Explain why rescaling of inputs is sometimes necessary for Neural Networks.

1 Answer

Final answer:

Rescaling inputs is sometimes necessary for neural networks to improve performance, prevent vanishing or exploding gradients, and enable efficient computation.

Step-by-step explanation:

Scaling or rescaling inputs is sometimes necessary for neural networks because it improves model performance and convergence. Here are a few reasons why rescaling matters:

  1. Normalization of features: Neural networks train better when input features share a similar range. Rescaling ensures that no single feature dominates the weight updates simply because its values are numerically larger than the others.
  2. Avoiding gradient issues: When features differ in scale by orders of magnitude, gradients can vanish or explode during backpropagation, which slows or destabilizes learning. Rescaling keeps gradient magnitudes more uniform across weights.
  3. Efficient computation: Rescaled inputs typically converge faster because they fall within the range where activation functions such as sigmoid and tanh are most sensitive, rather than in their saturated regions.
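The points above can be sketched with two common rescaling schemes. This is a minimal illustration using NumPy with a hypothetical feature matrix (the data values are made up for the example); real projects often use a library utility such as scikit-learn's scalers instead.

```python
import numpy as np

# Hypothetical feature matrix: two features on very different scales
# (e.g. age in years and income in dollars).
X = np.array([
    [25.0,  40_000.0],
    [32.0,  85_000.0],
    [47.0, 120_000.0],
    [51.0,  62_000.0],
])

# Standardization (z-score scaling): each feature ends up with mean 0
# and standard deviation 1, so no feature dominates the gradient updates.
mean = X.mean(axis=0)
std = X.std(axis=0)
X_standardized = (X - mean) / std

# Min-max scaling: each feature is mapped into [0, 1], keeping inputs
# close to the range where activations like sigmoid or tanh are sensitive.
X_minmax = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))

print(X_standardized.mean(axis=0))  # approximately [0, 0]
print(X_standardized.std(axis=0))   # approximately [1, 1]
```

Note that the scaling statistics (mean, std, min, max) should be computed on the training set only and then reused to transform validation and test data, so that no information leaks from held-out data into training.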
