Show that the inverse of a square matrix $A$ exists if and only if the eigenvalues $\lambda_1, \lambda_2, \dots, \lambda_n$ of $A$ are different from zero. If $A^{-1}$ exists, show that its eigenvalues are $\dfrac{1}{\lambda_1}, \dfrac{1}{\lambda_2}, \dots, \dfrac{1}{\lambda_n}$.
If "\\lambda=0" is an eigenvalue of the matrix A, and "X\\ne0" is a corresponding column-eigenvector, then the inverse matrix "A^{-1}" doesn't exist, since "X=IX=(A^{-1}A)X=A^{-1}(AX)=A^{-1}0=0".
Conversely, if $\lambda=0$ is not an eigenvalue of the matrix $A$, then $\lambda=0$ is not a root of the characteristic polynomial $p_A(\lambda)=\det(A-\lambda I)$ of the matrix $A$. Therefore $\det A = p_A(0) \ne 0$, and the matrix $A$ is invertible.
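As a quick numerical illustration of this equivalence (a minimal sketch with numpy, not part of the proof; the matrices below are arbitrary examples), one can check that a matrix with a zero eigenvalue has zero determinant, while a matrix whose eigenvalues are all nonzero is invertible:

```python
import numpy as np

# A singular matrix: the second row is twice the first, so 0 is an eigenvalue.
A_singular = np.array([[1.0, 2.0],
                       [2.0, 4.0]])
print(np.linalg.eigvals(A_singular))   # one eigenvalue is 0
print(np.linalg.det(A_singular))       # det = 0, so no inverse exists

# An invertible matrix: both eigenvalues are nonzero.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
print(np.linalg.eigvals(A))            # eigenvalues 2 and 3, both nonzero
print(np.linalg.det(A))                # det = 6 != 0, so A^{-1} exists
```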
Assuming $A$ is invertible, consider the characteristic polynomial $p_A(\lambda)=\det(A-\lambda I)$ of the matrix $A$. For $\lambda\ne 0$,
"p_{A}(\\lambda)=\\det(A-\\lambda I)=\\det A\\det(I-\\lambda A^{-1})=(-\\lambda)^n\\det A \\det(A^{-1} - \\lambda^{-1}I)=(-\\lambda)^n\\det A \\cdot p_{A^{-1}}(1\/\\lambda)"
Since $(-\lambda)^n\det A\ne 0$ for $\lambda\ne 0$, this identity shows that a nonzero $\lambda$ is an eigenvalue of the matrix $A$ if and only if $1/\lambda$ is an eigenvalue of the matrix $A^{-1}$. Moreover, the multiplicity of the eigenvalue $\lambda$ of $A$ equals the multiplicity of the eigenvalue $1/\lambda$ of $A^{-1}$.
So, if $\lambda_1,\dots,\lambda_n$ are the eigenvalues of the matrix $A$ (listed with multiplicity), then $1/\lambda_1,\dots,1/\lambda_n$ are the eigenvalues of the matrix $A^{-1}$.
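This conclusion is easy to verify numerically as well (again a sketch with numpy; the matrix is an arbitrary invertible example): the eigenvalues of $A^{-1}$ should coincide with the reciprocals of the eigenvalues of $A$.

```python
import numpy as np

# An arbitrary invertible example matrix.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

eig_A = np.linalg.eigvals(A)
eig_A_inv = np.linalg.eigvals(np.linalg.inv(A))

# Sorting makes the lists comparable; eig_A_inv should equal 1/eig_A.
print(np.sort(1.0 / eig_A))   # reciprocals of the eigenvalues of A
print(np.sort(eig_A_inv))     # eigenvalues of A^{-1} -- the same values
```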
The assertion is proved.