This is a basic fact of Linear Algebra. When a matrix cannot be diagonalized, we can at least obtain a particularly well-behaved form. Consider, for simplicity, an endomorphism of a finite-dimensional $\mathbb{C}$-vector space $V$, and suppose it has only one eigenvalue $\lambda$ (if not, we can decompose the vector space into invariant subspaces satisfying this property; that decomposition is not elementary).
We will deal with a particular case, for the sake of exposition. Consider $f\colon V \to V$ with $\dim V = 4$, such that the characteristic polynomial is $(x-\lambda)^4$. Define
$$N = f - \lambda\,\mathrm{id}.$$
We have $N^4 = 0$, by the Cayley-Hamilton theorem. Also,
$$f(v) = \lambda v + Nv \quad \text{for every } v \in V.$$
Observe that:
if $Nv = 0$ then $f(v) = \lambda v$, and $v$ is an eigenvector.
if $N^2v = 0$ but $Nv \neq 0$, then the pair $\{Nv, v\}$ is independent and:
a) $f(Nv) = \lambda Nv$
b) $f(v) = \lambda v + Nv$
so they can be part of a basis giving rise to a block $\begin{pmatrix}\lambda & 1\\ 0 & \lambda\end{pmatrix}$.
if $N^3v = 0$ but $N^2v \neq 0$, then the set $\{N^2v, Nv, v\}$ is independent and:
a) $f(N^2v) = \lambda N^2v$
b) $f(Nv) = \lambda Nv + N^2v$
c) $f(v) = \lambda v + Nv$
so they can be part of a basis giving rise to a block $\begin{pmatrix}\lambda & 1 & 0\\ 0 & \lambda & 1\\ 0 & 0 & \lambda\end{pmatrix}$.
and so on.
Therefore, we have the algebraic multiplicity of the eigenvalue $\lambda$, which is 4 in this case, the dimension of the eigenspace $\ker N$, and the sequence of dimensions of the generalized eigenspaces, defined as $d_k = \dim\ker N^k$, $k = 1, \dots, 4$. I.e., we have the sequence:
$$d_1 \leq d_2 \leq d_3 \leq d_4 = 4.$$
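As a quick illustration (the matrix below is a made-up example with single eigenvalue $\lambda = 2$, not one taken from the text), this sequence can be computed with SymPy:

```python
from sympy import Matrix, eye, zeros

A = Matrix([
    [2, 1, 1, 0],
    [0, 2, 1, 0],
    [0, 0, 2, 0],
    [0, 0, 0, 2],
])
lam = 2
N = A - lam * eye(4)                   # N = f - lambda*id

print(N**4 == zeros(4, 4))             # True, as Cayley-Hamilton predicts

# d_k = dim ker N^k, via rank-nullity: dim ker = 4 - rank
d = [4 - (N**k).rank() for k in range(1, 5)]
print(d)                               # [2, 3, 4, 4]
```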
We also have the following lemma: Lemma. It is satisfied that $d_{k+1} - d_k \leq d_1$ for every $k$. Proof
For each $k$ consider the linear map
$$N^k\big|_{\ker N^{k+1}}\colon \ker N^{k+1} \longrightarrow V.$$
Its kernel is
$$\{v \in \ker N^{k+1} : N^kv = 0\} = \ker N^k,$$
so its image has dimension $d_{k+1} - d_k$. Its image lies in $\ker N$ (because $N(N^kv) = N^{k+1}v = 0$), so $d_{k+1} - d_k \leq \dim\ker N$. But then,
$$\dim\ker N^{k+1} - \dim\ker N^k \leq \dim\ker N = d_1,$$
i.e. the jumps in the sequence of the $d_k$ can never exceed $d_1$. $\square$
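A quick numerical sanity check of the lemma (again with a made-up nilpotent matrix, here with two Jordan blocks of sizes 3 and 2):

```python
from sympy import zeros

# Nilpotent N with blocks of sizes 3 and 2: superdiagonal 1's inside each block.
N = zeros(5, 5)
for i in (0, 1, 3):
    N[i, i + 1] = 1

d = [5 - (N**k).rank() for k in range(1, 6)]
print(d)                                        # [2, 4, 5, 5, 5]

increments = [d[0]] + [d[k] - d[k - 1] for k in range(1, 5)]
print(all(inc <= d[0] for inc in increments))   # True: each jump is at most d_1
```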
We have several cases:
4,4,4,4. Then we have enough eigenvectors, and the matrix is diagonalizable. We have
$$\begin{pmatrix}\lambda & 0 & 0 & 0\\ 0 & \lambda & 0 & 0\\ 0 & 0 & \lambda & 0\\ 0 & 0 & 0 & \lambda\end{pmatrix}.$$
3,4,4,4. We can take $v \in \ker N^2 \setminus \ker N$, so we have the pair $\{Nv, v\}$, which can be extended with two independent vectors $w_1, w_2 \in \ker N$. We have in that basis:
$$\begin{pmatrix}\lambda & 1 & 0 & 0\\ 0 & \lambda & 0 & 0\\ 0 & 0 & \lambda & 0\\ 0 & 0 & 0 & \lambda\end{pmatrix}.$$
2,4,4,4. We can take $v_1, v_2$ with $Nv_1, Nv_2$ independent, so we have the sets $\{Nv_1, v_1\}$ and $\{Nv_2, v_2\}$, which constitute a basis in which the matrix is:
$$\begin{pmatrix}\lambda & 1 & 0 & 0\\ 0 & \lambda & 0 & 0\\ 0 & 0 & \lambda & 1\\ 0 & 0 & 0 & \lambda\end{pmatrix}.$$
1,4,4,4; 1,1,4,4; 1,2,4,4; 1,1,1,4; 1,2,2,4. These cannot happen, because of the lemma above.
1,2,3,4. We can take $v \notin \ker N^3$, and the set $\{N^3v, N^2v, Nv, v\}$ is a basis with associated matrix
$$\begin{pmatrix}\lambda & 1 & 0 & 0\\ 0 & \lambda & 1 & 0\\ 0 & 0 & \lambda & 1\\ 0 & 0 & 0 & \lambda\end{pmatrix}.$$
2,3,3,4. Impossible, since for $v \notin \ker N^3$, the set $\{v, Nv, N^2v, N^3v\}$ is independent, and only $N^3v$ can belong to $\ker N$, while $\dim\ker N = 2$. So we would have 5 independent vectors.
2,3,4,4. We can take $v \in \ker N^3 \setminus \ker N^2$ and the set $\{N^2v, Nv, v\}$ is independent. We take $w \in \ker N$ independent of $N^2v$ and we obtain a basis with matrix
$$\begin{pmatrix}\lambda & 1 & 0 & 0\\ 0 & \lambda & 1 & 0\\ 0 & 0 & \lambda & 0\\ 0 & 0 & 0 & \lambda\end{pmatrix}.$$
3,3,4,4. Analogous to case 6. We can take $v \in \ker N^3 \setminus \ker N^2$ and the set $\{v, Nv, N^2v\}$ is independent, but only $N^2v$ belongs to $\ker N$, while $\dim\ker N = 3$. Then we would have five independent vectors.
3,3,3,4. Idem. For $v \notin \ker N^3$, the set $\{v, Nv, N^2v, N^3v\}$ is independent, and only $N^3v$ can belong to $\ker N$, while $\dim\ker N = 3$. We would have 6 independent vectors.
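For example, SymPy's jordan_form can double-check one of the realizable sequences; with the made-up matrix used earlier, whose sequence is 2,3,4,4, we recover one block of size 3 and one of size 1:

```python
from sympy import Matrix

A = Matrix([
    [2, 1, 1, 0],
    [0, 2, 1, 0],
    [0, 0, 2, 0],
    [0, 0, 0, 2],
])
P, J = A.jordan_form()     # A = P*J*P**-1
print(J)                   # one 3x3 Jordan block and one 1x1 block, eigenvalue 2
```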
Old stuff
In general, we have that for an endomorphism $f\colon V \to V$
$$V/\ker f \cong \operatorname{im} f,$$
and so $\dim V = \dim\ker f + \dim\operatorname{im} f$.
But we don't always have that $V = \ker f \oplus \operatorname{im} f$. Now, if $f^2 = f$, i.e., $f$ is idempotent, the previous formula holds: for every $v \in V$ we have that $v = (v - f(v)) + f(v)$, with $v - f(v) \in \ker f$ and $f(v) \in \operatorname{im} f$; and conversely, for $w \in \ker f \cap \operatorname{im} f$ we can take $u$ such that
$$w = f(u), \qquad\text{so}\qquad w = f(u) = f^2(u) = f(w) = 0.$$
This is the same as $f$ being a projection.
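A minimal sketch of this in SymPy (the projection matrix is a hypothetical example):

```python
from sympy import Matrix

P = Matrix([[1, 1],
            [0, 0]])
print(P**2 == P)                      # True: P is idempotent (a projection)

kernel = P.nullspace()                # basis of ker P
image  = P.columnspace()              # basis of im P
basis  = Matrix.hstack(*kernel, *image)
print(basis.rank() == 2)              # True: ker P and im P together span V
```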
More generally, we have: Proposition
If $f$ is such that $\ker f = \ker f^2$, we have
$$V = \ker f \oplus \operatorname{im} f.$$
Proof
First, let's observe that the restriction $f|_{\operatorname{im} f}\colon \operatorname{im} f \to \operatorname{im} f$ is an isomorphism. This is so because it is surjective (from $\ker f = \ker f^2$ we get $\dim\operatorname{im} f^2 = \dim\operatorname{im} f$, hence $f(\operatorname{im} f) = \operatorname{im} f^2 = \operatorname{im} f$) and the dimensions of domain and codomain are equal.
We can construct an isomorphism $\Phi\colon V \to \ker f \times \operatorname{im} f$ in the following way:
$$\Phi(v) = (v - u,\; u),$$
where $u \in \operatorname{im} f$ is such that $f(u) = f(v)$ ($u$ exists because $f|_{\operatorname{im} f}$ is surjective and $f(v) \in \operatorname{im} f$); note that $v - u \in \ker f$.
It is easy to prove injectivity and surjectivity. It only remains to show that $\Phi$ is well defined. Let us suppose that, for some $v$, we get $u, u' \in \operatorname{im} f$ such that $f(u) = f(u') = f(v)$. Then, since the restriction of $f$ to $\operatorname{im} f$ is an isomorphism, $u = u'$. $\square$
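A small check of the proposition with a matrix that is not idempotent but satisfies $\ker f = \ker f^2$ (a made-up example):

```python
from sympy import Matrix

F = Matrix([[1, 1],
            [1, 1]])
print(F**2 == F)                      # False: F is not a projection (F**2 == 2*F)
print(F.rank() == (F**2).rank())      # True: ker F = ker F^2 (nested and of equal dimension)

basis = Matrix.hstack(*F.nullspace(), *F.columnspace())
print(basis.rank() == 2)              # True: V = ker F (+) im F
```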
In the same way, we can prove the following proposition. Proposition
For any endomorphism $f\colon V \to V$ there exists $k$ such that
$$V = \ker f^k \oplus \operatorname{im} f^k.$$
Proof
Observe that
$$\ker f \subseteq \ker f^2 \subseteq \ker f^3 \subseteq \cdots \subseteq V.$$
And therefore, since $V$ is finite-dimensional, there exists $k$ such that $\ker f^k = \ker f^{k+1}$; from this, $\ker f^k = \ker f^m$ for every $m \geq k$, and in particular $\ker f^k = \ker f^{2k} = \ker (f^k)^2$.
At this point, we only have to apply the previous proposition to the transformation $f^k$. $\square$
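To illustrate (hypothetical example): for the nilpotent matrix below the decomposition fails at $k = 1$, since kernel and image coincide, but holds at $k = 2$, where the kernels have stabilized:

```python
from sympy import Matrix, zeros

F = Matrix([[0, 1],
            [0, 0]])

# k = 1: ker F and im F are the same line, so they cannot be complementary.
print(F.nullspace()[0].T, F.columnspace()[0].T)   # both span {(1, 0)}

# k = 2: F**2 = 0, so ker F^2 = V and im F^2 = 0, and V = ker F^2 (+) im F^2 trivially.
print(F**2 == zeros(2, 2))                        # True
```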
Now we are ready to show the Jordan canonical form. Let $V$ be a finite-dimensional complex vector space, and $f\colon V \to V$ an endomorphism. If $V$ is decomposable, we take a decomposition $V = V_1 \oplus V_2$, with each $V_i$ an $f$-invariant subspace, and proceed with every $V_i$. So assume $V$ is indecomposable.
Consider $\lambda$ such that $\ker(f - \lambda\,\mathrm{id}) \neq 0$. It exists, because we can choose a basis and construct the polynomial equation $\det(A - x\,I) = 0$, and $\mathbb{C}$ is algebraically closed. So, for a certain $\lambda$, $\ker(f - \lambda\,\mathrm{id})$ has, at least, dimension 1. Moreover, observe that $\lambda$ is the only (possibly multiple) solution to the previous polynomial equation, as will follow from the nilpotency of $f - \lambda\,\mathrm{id}$ shown below.
We define $N = f - \lambda\,\mathrm{id}$, and apply the previous proposition, so we get
$$V = \ker N^k \oplus \operatorname{im} N^k$$
for some $k$ (both summands are $f$-invariant, since $f = \lambda\,\mathrm{id} + N$).
Since $V$ is indecomposable and $\ker N^k \neq 0$ (since $\ker N \neq 0$) we have that $\operatorname{im} N^k = 0$, and therefore $N^k$ is the null transformation and $N$ is \emph{nilpotent}. Let $p$ be the nilpotency index, and $v$ such that $N^{p-1}v \neq 0$ but $N^pv = 0$.
We can show that $\{v, Nv, \dots, N^{p-1}v\}$ is a basis for $V$. First, observe that these vectors are independent, because if
$$a_0v + a_1Nv + \cdots + a_{p-1}N^{p-1}v = 0,$$
we can apply $N$ several times to show that every $a_i = 0$: applying $N^{p-1}$ gives $a_0 = 0$, then applying $N^{p-2}$ gives $a_1 = 0$, and so on.
And second, we can show that they span $V$. In fact, let $W = \langle v, Nv, \dots, N^{p-1}v\rangle$. Observe that:
$f(N^{p-1}v) = \lambda N^{p-1}v$, since $N^pv = 0$.
$f(N^{p-2}v) = \lambda N^{p-2}v + N^{p-1}v$, since $f = \lambda\,\mathrm{id} + N$. Therefore, $f(N^{p-2}v) \in W$, because it is a combination of $N^{p-2}v$ and $N^{p-1}v$, and so it stays inside $W$.
In general, $f(N^iv) = \lambda N^iv + N^{i+1}v \in W$, and therefore $W$ is an $f$-invariant subspace.
Since $W \neq 0$ is $f$-invariant (and such a cyclic subspace, generated by a vector of maximal height, always admits an $f$-invariant complement) and $V$ is, by hypothesis, indecomposable, we conclude that $V = W$.
It only remains to show the form of the matrix of $f$ with respect to the basis $\{N^{p-1}v, \dots, Nv, v\}$. But because of the facts in the above enumeration we conclude that it is
$$\begin{pmatrix}\lambda & 1 & & \\ & \lambda & \ddots & \\ & & \ddots & 1\\ & & & \lambda\end{pmatrix}.$$
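As a last sanity check (the 3x3 matrix and the chosen $v$ below are hypothetical, not from the text), one can rebuild the chain basis explicitly and verify that the matrix in that basis is a single Jordan block:

```python
from sympy import Matrix, eye

lam = 3
A = Matrix([
    [3, 1, 1],
    [0, 3, 1],
    [0, 0, 3],
])
N = A - lam * eye(3)                     # nilpotent, with nilpotency index p = 3
v = Matrix([0, 0, 1])                    # N**2 * v != 0, so v has maximal height

P = Matrix.hstack(N**2 * v, N * v, v)    # chain basis (N^2 v, N v, v)
print(P.inv() * A * P)                   # a single 3x3 Jordan block with lambda = 3
```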
In conclusion: for any endomorphism $f$ of $V$ we can find a decomposition $V = V_1 \oplus \cdots \oplus V_r$ into $f$-stable vector subspaces of $V$.
On each of them, $f$ has the form of the matrix above (in a particular basis). This is so because either $f|_{V_i}$ is a scalar matrix ($\lambda_i\,\mathrm{id}$) or $f|_{V_i} - \lambda_i\,\mathrm{id}$ is nilpotent, and for nilpotent operators we have the special basis $\{N^{p-1}v, \dots, Nv, v\}$.
By the way: there may exist other decompositions into $f$-stable subspaces that do not satisfy that matrix form. Even the Jordan canonical form is not unique (it is unique only up to the order of the blocks).