Let λ be an eigenvalue of an invertible matrix A. Show that λ⁻¹ is an eigenvalue of A⁻¹.

1 Answer


Final answer:

If λ is an eigenvalue of an invertible matrix A, then λ⁻¹ is an eigenvalue of A⁻¹.

This is shown by applying A⁻¹ to the eigenvalue equation Av = λv and rearranging to obtain A⁻¹v = λ⁻¹v, which proves that λ⁻¹ is an eigenvalue of A⁻¹ with the same eigenvector v.

Step-by-step explanation:

If λ is an eigenvalue of an invertible matrix A, there exists a non-zero vector v such that Av = λv.

Since A is invertible, we can multiply both sides of this equation by A⁻¹ to get A⁻¹(Av) = A⁻¹(λv).

By the associative property of matrix multiplication, A⁻¹(Av) = (A⁻¹A)v = Iv and A⁻¹(λv) = λ(A⁻¹v), where I is the identity matrix, so the equation becomes Iv = λ(A⁻¹v).

Since Iv = v, we have v = λ(A⁻¹v).

Because A is invertible, λ ≠ 0 (if λ were 0, then Av = 0 for the non-zero vector v, and A could not be invertible), so λ⁻¹ exists. Multiplying both sides by λ⁻¹ gives λ⁻¹v = A⁻¹v, showing that λ⁻¹ is indeed an eigenvalue of A⁻¹ with the same eigenvector v.
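If you want a quick numerical sanity check of this result, here is a minimal sketch using NumPy. The 2×2 matrix A below is an arbitrary invertible example chosen for illustration (it is not part of the original question); the check confirms that each eigenvector v of A satisfies A⁻¹v = λ⁻¹v.

```python
import numpy as np

# Arbitrary invertible example matrix (eigenvalues 5 and 2).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Eigen-decomposition of A: w holds eigenvalues, columns of V are eigenvectors.
w, V = np.linalg.eig(A)

A_inv = np.linalg.inv(A)

for lam, v in zip(w, V.T):
    # Original eigenvalue equation: A v = λ v.
    assert np.allclose(A @ v, lam * v)
    # Claimed result: A⁻¹ v = λ⁻¹ v, i.e. 1/λ is an eigenvalue of A⁻¹
    # with the same eigenvector v.
    assert np.allclose(A_inv @ v, (1.0 / lam) * v)

print("Verified: for each eigenpair (λ, v) of A, A⁻¹ v equals (1/λ) v")
```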

answered by Dotnetengineer (7.7k points)