4 votes
When training a Neural Network by Backpropagation, what happens if the Learning Rate is too low?

User GISHuman (8.8k points)

1 Answer

4 votes

Final answer:

If the Learning Rate in backpropagation is too low, the Neural Network will converge very slowly, leading to longer training times and potentially causing the model to settle at a suboptimal solution.

Step-by-step explanation:

When training a neural network by backpropagation, if the learning rate is too low, convergence towards the minimum of the loss function will be very slow. The network will require many more epochs (iterations over the training dataset) to learn, and it may never reach good weights if training is stopped prematurely. A low learning rate makes each weight update very small, so the network adjusts itself towards the expected output only gradually. While this can allow fine-grained features of the data to be captured given enough time, it is often impractical in terms of computational resources and training time. In addition, with a fixed training budget, the model has a higher chance of ending at a suboptimal solution simply because it has not had enough iterations to approach the minimum.
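The effect can be illustrated with a toy gradient-descent loop. This is a hypothetical one-parameter example (the loss f(w) = (w - 3)², not anything from the original question), but the same update rule drives backpropagation:

```python
# Toy illustration of the learning-rate effect: gradient descent on
# f(w) = (w - 3)**2, whose minimum is at w = 3. Gradient: 2 * (w - 3).
def train(learning_rate, steps=100):
    w = 0.0  # initial weight
    for _ in range(steps):
        grad = 2 * (w - 3)
        w -= learning_rate * grad  # the same update backpropagation applies
    return w

w_moderate = train(0.1)    # reaches the minimum (w ≈ 3) within 100 steps
w_too_low  = train(0.001)  # barely moves: still far from w = 3
print(w_moderate, w_too_low)
```

With the same 100-step budget, the moderate learning rate converges to the minimum while the low one leaves the weight far from it, which is exactly the "slow convergence under a fixed training budget" problem described above.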

User Beerweasle (7.8k points)