Some Theorems about Eigenvalues
Matrix with Distinct Eigenvalues
Usually, it is not easy to determine whether a square matrix is diagonalizable or not. But if an n x n matrix has n distinct eigenvalues, then the matrix is diagonalizable, thanks to the following theorem:
Theorem: If $v_1, v_2, \ldots, v_r$ are eigenvectors that correspond to distinct eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_r$ of an n x n matrix $A$, then the set $\{v_1, v_2, \ldots, v_r\}$ is linearly independent.
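As a quick numerical sanity check of the theorem (not a proof), the following NumPy sketch collects the eigenvectors of a small matrix with distinct eigenvalues as the columns of a matrix and verifies that it has full rank; the matrix $A$ below is an arbitrary illustrative choice.

```python
import numpy as np

# Arbitrary 2 x 2 matrix with distinct eigenvalues 2 and 5 (assumed example).
A = np.array([[2.0, 1.0],
              [0.0, 5.0]])

eigenvalues, V = np.linalg.eig(A)   # columns of V are eigenvectors v_1, v_2
print(eigenvalues)                  # [2. 5.] -- distinct
print(np.linalg.matrix_rank(V))     # 2 -> the eigenvectors are linearly independent
```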
We can prove the theorem by induction on $r$: When $r = 1$, the theorem is trivial as $v_1$ is non-zero. Assume the theorem is true when $r = k$. Now consider the case when $r = k + 1$. Assume $\{v_1, v_2, \ldots, v_{k+1}\}$ is linearly dependent. Since the first $k$ vectors are linearly independent by the induction hypothesis, we have
$v_{k+1} = c_1 v_1 + c_2 v_2 + \cdots + c_k v_k$ ----(1)
for some real numbers $c_1, c_2, \ldots, c_k$. Applying the matrix $A$ to both sides of (1), we get
$\lambda_{k+1} v_{k+1} = c_1 \lambda_1 v_1 + c_2 \lambda_2 v_2 + \cdots + c_k \lambda_k v_k$ ----(2)
Multiplying (1) by $\lambda_{k+1}$ and subtracting (2), we get
$c_1 (\lambda_{k+1} - \lambda_1) v_1 + c_2 (\lambda_{k+1} - \lambda_2) v_2 + \cdots + c_k (\lambda_{k+1} - \lambda_k) v_k = 0$
Since $\{v_1, v_2, \ldots, v_k\}$ is linearly independent, $c_i (\lambda_{k+1} - \lambda_i) = 0$ for $i = 1, 2, \ldots, k$. As all eigenvalues are distinct, we have $\lambda_{k+1} - \lambda_i \neq 0$, which implies $c_i = 0$ for every $i$. But then (1) gives $v_{k+1} = 0$, which contradicts the fact that $v_{k+1}$ is an eigenvector of $A$ (eigenvectors are non-zero by definition).
Hence, $\{v_1, v_2, \ldots, v_{k+1}\}$ is linearly independent, and the induction is complete.
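This independence is exactly why n distinct eigenvalues force diagonalizability: the n eigenvectors can be taken as the columns of an invertible matrix $P$, and $P^{-1}AP$ is diagonal. A minimal NumPy sketch under that assumption, using an arbitrary 3 x 3 matrix with distinct eigenvalues:

```python
import numpy as np

# Arbitrary upper-triangular matrix, so its eigenvalues 4, 2, -1 are distinct.
A = np.array([[4.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, -1.0]])

eigenvalues, P = np.linalg.eig(A)    # columns of P are eigenvectors of A
D = np.linalg.inv(P) @ A @ P         # should be diag(4, 2, -1) up to rounding
print(np.round(D, 10))
```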
Cayley-Hamilton Theorem
Given an n x n matrix $A$, we already know that its eigenvalues satisfy the characteristic equation $p(\lambda) = \det(A - \lambda I) = 0$. The following famous theorem says that the matrix $A$ itself also satisfies the characteristic equation:
Cayley-Hamilton theorem: $p(A) = 0$, where $p$ is the characteristic polynomial of $A$ and $0$ denotes the zero matrix.
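As a quick numerical illustration of the statement (not part of the proof), one can compute the coefficients of the characteristic polynomial of a small matrix and evaluate the polynomial at the matrix itself; the symmetric 3 x 3 matrix below is an arbitrary choice.

```python
import numpy as np

# Arbitrary symmetric 3 x 3 matrix (chosen only so the eigenvalues are real).
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])

coeffs = np.poly(A)                 # coefficients of det(t*I - A), highest degree first
p_of_A = np.zeros_like(A)
for c in coeffs:                    # Horner evaluation: p(A) = sum_k c_k A^k
    p_of_A = p_of_A @ A + c * np.eye(3)
print(np.round(p_of_A, 8))          # the zero matrix, as the theorem predicts
```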
Recall the earlier example and its characteristic equation $p(\lambda) = 0$. By the Cayley-Hamilton theorem, substituting the matrix $A$ itself into the characteristic polynomial gives the zero matrix: $p(A) = 0$.
(Note: the constant term $c$ of the polynomial is regarded as the scalar multiple $cI$ of the identity matrix when $A$ is substituted in.)
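Since the original example is not reproduced here, the sketch below uses an assumed 2 x 2 matrix. For any 2 x 2 matrix the characteristic equation has the form $\lambda^2 + b\lambda + c = 0$ with $b = -\mathrm{tr}(A)$ and $c = \det(A)$, so the theorem asserts $A^2 + bA + cI = 0$, with the constant term multiplied by the identity matrix as in the note above.

```python
import numpy as np

# Assumed stand-in for the example matrix; any 2 x 2 matrix works here.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

b = -np.trace(A)                    # coefficient of lambda in det(A - lambda*I) = 0
c = np.linalg.det(A)                # constant term
print(np.round(A @ A + b * A + c * np.eye(2), 10))   # zero matrix
```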
The full proof of the Cayley-Hamilton theorem is beyond the scope of this course. However, we can easily prove the special case when $A$ is a diagonalizable matrix, i.e. $A = PDP^{-1}$, where $D$ is a diagonal matrix whose diagonal entries are the eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_n$ of $A$. First of all, notice that for any non-negative integer $k$,
$A^k = PD^kP^{-1}$.
And for the diagonal matrix $D$, $D^k = \mathrm{diag}(\lambda_1^k, \lambda_2^k, \ldots, \lambda_n^k)$.
Therefore, we have $p(A) = P\,p(D)\,P^{-1}$ with $p(D) = \mathrm{diag}(p(\lambda_1), p(\lambda_2), \ldots, p(\lambda_n))$. Since $p$ is the characteristic polynomial, $p(\lambda_i) = 0$ for $i = 1, 2, \ldots, n$. In other words, $p(D)$ is the zero matrix, and hence $p(A) = P\,p(D)\,P^{-1} = 0$.
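The following NumPy sketch mirrors this special-case argument for an arbitrary symmetric (hence diagonalizable) matrix: it forms $p(D)$ by evaluating the characteristic polynomial at each eigenvalue and then checks that $P\,p(D)\,P^{-1}$ is the zero matrix.

```python
import numpy as np

# Arbitrary symmetric matrix, hence diagonalizable; its eigenvalues are 1 and 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, P = np.linalg.eig(A)     # A = P D P^{-1} with D = diag(eigenvalues)
coeffs = np.poly(A)                   # characteristic polynomial of A

p_of_D = np.diag(np.polyval(coeffs, eigenvalues))   # p(D) = diag(p(lambda_i)) = 0
p_of_A = P @ p_of_D @ np.linalg.inv(P)              # p(A) = P p(D) P^{-1}
print(np.round(p_of_A, 10))                         # zero matrix
```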