When training a Neural Network by Backpropagation, what happens if the Learning Rate is too high?

Final answer:

A learning rate that is too high makes backpropagation unstable: the weight updates overshoot the minimum of the loss, so training oscillates or diverges instead of converging.

Step-by-step explanation:

When training a neural network by backpropagation, the learning rate controls how far the weights move on each update. If it is too high, every step is too large: the weights jump past the optimal values, the loss can bounce back and forth or even grow, and the model may converge much more slowly or fail to converge at all.
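Concretely, plain gradient descent updates each weight as

w ← w − η · ∂L/∂w

where η is the learning rate and L is the loss. The step taken is directly proportional to η, so doubling the learning rate doubles how far every weight moves on each iteration, which is why an overly large η can carry the weights straight past the minimum.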

For example, consider a simple network with two input features and a single output. If the learning rate is set too high, the weights change by a large amount in each iteration, so the model overshoots the optimal solution and oscillates around it (a runnable sketch of this is given below).
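Here is a minimal sketch of that two-input example (the data, targets, and rates below are invented for illustration): a single linear neuron trained with plain gradient descent, once with a small learning rate and once with a large one. The large rate makes the loss explode instead of shrink.

import numpy as np

def train(learning_rate, steps=20):
    # Toy data: two input features per sample (invented for illustration).
    X = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 3.0]])
    y = np.array([5.0, 4.0, 9.0])            # targets follow y = x1 + 2*x2
    w = np.zeros(2)                          # weights of the linear neuron
    for _ in range(steps):
        pred = X @ w                         # forward pass
        grad = 2 * X.T @ (pred - y) / len(y) # gradient of the mean squared error
        w -= learning_rate * grad            # gradient-descent weight update
    return np.mean((X @ w - y) ** 2)         # final training loss

print(train(0.01))   # small rate: loss shrinks steadily
print(train(1.0))    # large rate: updates overshoot and the loss blows up

With the small rate, each update takes a modest step toward the minimum and the loss decreases; with the large rate, each update jumps past the minimum by more than it corrects, so the error grows on every iteration.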

In summary, a learning rate that is too high during backpropagation destabilizes training: the weights repeatedly overshoot the optimum, so the network oscillates, converges slowly, or diverges outright.
