Which of the following statements about optimization is true?

A: If the function g(w) is convex, differentiable, and has exactly one local minimum, then the gradient descent algorithm always converges to that minimum for any choice of step length.

B: The function g(w) = δ||1 - w||^2 + γ, where δ and γ are constants, is always convex and has a unique minimum.

C: For the function g(w) = w^2 if w ≤ 0, and g(w) = w otherwise, there exists a step length α such that starting from w = 1, the gradient descent algorithm will take us to the minimum of g(w) in just one step.

D: The function g(w) = ||Xw - y||_1 has a continuous first-order gradient everywhere, where X and y are a constant data matrix and vector, respectively.

asked by User Duende

1 Answer


Final answer:

Statement B is true: the function g(w) = δ||1 - w||^2 + γ is convex and has a unique minimum at w = 1 (with value γ), because the squared term ||1 - w||^2 is a parabola and scaling it by δ (taken as positive, as the statement intends) and adding the constant γ preserve its convex shape. Properties like this determine whether gradient descent can reliably find a function's minimum.

Step-by-step explanation:

The question concerns optimization and how the properties of a function, such as convexity and differentiability, interact with algorithms like gradient descent. To judge each statement, one must consider both the shape of the function involved and the conditions, in particular the choice of step length, under which gradient descent actually converges.
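
For reference, the iteration being discussed is standard fixed-step gradient descent (the symbol α for the step length follows statement C; the rest of the notation here is added for illustration):

w_{k+1} = w_k - \alpha \, \nabla g(w_k), \qquad k = 0, 1, 2, \ldots

Even for a convex, differentiable function this iteration converges only if α is chosen suitably; a standard sufficient condition is a fixed step α ≤ 1/L when ∇g is L-Lipschitz, which is why convergence "for any choice of step length" cannot be taken for granted.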

Statement B asserts that the function g(w) = δ||1 - w||^2 + γ, with δ and γ constants, is convex with a unique minimum. This is indeed the case (for δ > 0): ||1 - w||^2 is a parabola centered at w = 1, multiplying it by the positive constant δ only rescales it, and adding γ merely shifts it vertically, so the function keeps its convex shape and its single minimizer at w = 1, where it attains the value γ.
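
To make this concrete, here is a minimal sketch in Python (written for this answer, not taken from the question; the helper gradient_descent, the sample constants δ = 2 and γ = 3, and the starting point w0 = 5 are all illustrative choices):

def gradient_descent(grad, w0, alpha, n_steps=50):
    """Fixed-step gradient descent: repeat w <- w - alpha * grad(w)."""
    w = w0
    for _ in range(n_steps):
        w = w - alpha * grad(w)
    return w

delta, gamma = 2.0, 3.0                        # illustrative constants, delta > 0
grad_g = lambda w: 2.0 * delta * (w - 1.0)     # derivative of g(w) = delta*(1 - w)**2 + gamma

print(gradient_descent(grad_g, w0=5.0, alpha=0.1))              # ~1.0: converges to the minimizer w = 1
print(gradient_descent(grad_g, w0=5.0, alpha=0.6, n_steps=20))  # grows very large: step length too big

With a small step length the iterates settle at w ≈ 1, the unique minimizer described above, while the overly large step length makes them diverge even though g is convex, which echoes why statement A's "any choice of step length" fails.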

Understanding how these properties of a function affect the behavior of gradient descent is crucial for successful optimization in mathematics and related fields.

answered by User Kuvalya