Consider the least squares problem: minimize f(x) = ∥Ax − b∥²/2, where A is an m×n matrix with m ≥ n and b is a vector of length m. Assume A has full column rank.

(i) Write down the first-order necessary condition for optimality. Is this also a sufficient condition?


1 Answer


Final answer:

The first-order necessary condition for optimality is that the gradient of f vanishes: ∇f(x) = Aᵀ(Ax − b) = 0, which is the system of normal equations AᵀA x = Aᵀb. Because A has full column rank, the Hessian AᵀA is positive definite, so f is strictly convex and this condition is also sufficient: the unique global minimizer is x = (AᵀA)⁻¹Aᵀb.

Step-by-step explanation:

Since f(x) = ∥Ax − b∥²/2, its gradient is ∇f(x) = Aᵀ(Ax − b). Setting the gradient to zero gives the first-order necessary condition AᵀA x = Aᵀb, known as the normal equations. For a general smooth function, a vanishing gradient alone does not guarantee a minimum; it only identifies a stationary point.

To classify a stationary point, one checks the second-order condition: the Hessian of f should be positive semidefinite there (positive definite for a strict local minimum). Here the Hessian is ∇²f(x) = AᵀA for every x, and since A has full column rank, AᵀA is positive definite. Therefore f is strictly convex, the stationary point is the unique global minimizer, and for this problem the first-order condition is also sufficient.
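
As a numerical sanity check, here is a minimal NumPy sketch (the sizes and random data below are purely illustrative assumptions, not part of the question) that solves the normal equations and confirms both points: the gradient vanishes at the solution, and the Hessian AᵀA is positive definite.

```python
import numpy as np

# Illustrative sizes and random data (assumptions for the sketch, not from the problem)
m, n = 50, 5
rng = np.random.default_rng(0)
A = rng.standard_normal((m, n))   # a random tall matrix has full column rank almost surely
b = rng.standard_normal(m)

# First-order condition: grad f(x) = A^T (A x - b) = 0, i.e. the normal equations A^T A x = A^T b
x = np.linalg.solve(A.T @ A, A.T @ b)

# The gradient at the solution is zero up to rounding error
grad = A.T @ (A @ x - b)
print("gradient norm:", np.linalg.norm(grad))

# Second-order check: the Hessian A^T A is positive definite (all eigenvalues > 0)
print("smallest Hessian eigenvalue:", np.linalg.eigvalsh(A.T @ A).min())
```

In practice one would call np.linalg.lstsq(A, b) (or another QR/SVD-based solver), which avoids forming AᵀA and is better conditioned; the normal-equations route above is only meant to make the optimality condition explicit.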
