1 vote
In the Gradient Descent algorithm, we are more likely to reach the global minimum if the learning rate is set to a large value.

a. True
b. False

by User Neubert (3.5k points)

2 Answers

0 votes

Answer:

False, I think.

Step-by-step explanation:

Gradient Descent is more likely to reach a local minimum: different starting points will, in general, lead to different local minima (i.e., the lowest point nearest the starting point). And if alpha (the learning rate) is too large, gradient descent may fail to converge and may even diverge, overshooting the minimum on every step.
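Here is a minimal sketch of that divergence effect in Python, using the convex function f(x) = x^2 (so f'(x) = 2x). The function, starting point, and step counts are illustrative choices, not from the original answer:

# Gradient descent on f(x) = x**2, whose gradient is f'(x) = 2*x.
# Illustrative example: small alpha converges, large alpha diverges.

def gradient_descent(alpha, x0=5.0, steps=10):
    """Run `steps` updates of x <- x - alpha * f'(x) for f(x) = x**2."""
    x = x0
    for _ in range(steps):
        x = x - alpha * 2 * x  # gradient step with f'(x) = 2x
    return x

print(gradient_descent(alpha=0.1))  # ~0.54: shrinking toward the minimum at 0
print(gradient_descent(alpha=1.5))  # 5120.0: overshoots each step and blows up

For this particular function the update is x <- (1 - 2*alpha) * x, so the iterates shrink only when |1 - 2*alpha| < 1, i.e. 0 < alpha < 1; any larger learning rate makes each step grow, which is exactly the divergence described above.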

by User Sarah Messer (2.9k points)
5 votes
False.