Computation of Eigenvectors
Let A be a square matrix of order n and $\lambda$ one of its eigenvalues. Let X be an eigenvector of A associated to $\lambda$. We must have
\[
A X = \lambda X, \quad\text{that is,}\quad (A - \lambda I_n)\, X = 0.
\]
This is a homogeneous linear system whose coefficient matrix is $A - \lambda I_n$.
Since the zero-vector is a solution, the system is consistent. In fact, we will see on a different page that the structure of the solution set of this system is very rich. On this page, we will basically discuss how to find the solutions.
Remark. It is quite easy to notice that if X is a vector which satisfies
\[
A X = \lambda X,
\]
then the vector Y = c X (for any arbitrary number c) satisfies the same equation, i.e.
\[
A Y = A(cX) = c\,(AX) = c\,(\lambda X) = \lambda\,(cX) = \lambda Y.
\]
In other words, if we know that X is an eigenvector, then cX is also an eigenvector associated to the same eigenvalue (provided c is not zero, since the zero vector is never counted as an eigenvector).
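For the reader who wants to check this remark numerically, here is a small Python (NumPy) sketch; the 2-by-2 matrix and the eigenpair in it are illustrative choices of ours, not taken from the examples below.

import numpy as np

# Illustration of the remark: if A X = lam X, then A (cX) = lam (cX)
# for any scalar c.  The matrix and eigenpair here are just examples.
A = np.array([[2.0, 0.0],
              [0.0, 5.0]])
lam = 5.0
X = np.array([0.0, 1.0])            # eigenvector of A for lam = 5
c = -7.3                            # an arbitrary nonzero scalar
Y = c * X
print(np.allclose(A @ X, lam * X))  # True
print(np.allclose(A @ Y, lam * Y))  # True: cX is an eigenvector as well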
Let us start with an example.
Example. Consider the matrix
\[
A = \begin{pmatrix} 1 & 2 & 1 \\ 6 & -1 & 0 \\ -1 & -2 & -1 \end{pmatrix}.
\]
First we look for the eigenvalues of A. These are given by the characteristic equation
\[
\det(A - \lambda I_3) = 0,
\]
i.e.
\[
\begin{vmatrix} 1-\lambda & 2 & 1 \\ 6 & -1-\lambda & 0 \\ -1 & -2 & -1-\lambda \end{vmatrix} = 0.
\]
If we expand this determinant along the third column, we obtain
\[
1 \cdot \begin{vmatrix} 6 & -1-\lambda \\ -1 & -2 \end{vmatrix}
+ (-1-\lambda)\begin{vmatrix} 1-\lambda & 2 \\ 6 & -1-\lambda \end{vmatrix} = 0,
\]
that is,
\[
(-13 - \lambda) + (-1-\lambda)(\lambda^2 - 13) = 0.
\]
Using easy algebraic manipulations, we get
\[
-\lambda^3 - \lambda^2 + 12\lambda = -\lambda(\lambda + 4)(\lambda - 3) = 0,
\]
which implies that the eigenvalues of A are 0, -4, and 3.
Next we look for the eigenvectors.
1. Case $\lambda = 0$:
The associated eigenvectors are given by the linear system
\[
A X = 0,
\]
which may be rewritten as
\[
\begin{cases} x + 2y + z = 0 \\ 6x - y = 0 \\ -x - 2y - z = 0. \end{cases}
\]
Many ways may be used to solve this system. The third equation is just the first one multiplied by -1. Since, from the second equation, we have y = 6x, the first equation reduces to 13x + z = 0. So this system is equivalent to
\[
\begin{cases} y = 6x \\ z = -13x. \end{cases}
\]
So the unknown vector X is given by
\[
X = \begin{pmatrix} x \\ y \\ z \end{pmatrix} = \begin{pmatrix} x \\ 6x \\ -13x \end{pmatrix} = x \begin{pmatrix} 1 \\ 6 \\ -13 \end{pmatrix}.
\]
Therefore, any eigenvector X of A associated to the eigenvalue 0 is given by
\[
X = c \begin{pmatrix} 1 \\ 6 \\ -13 \end{pmatrix},
\]
where c is an arbitrary number.
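One may double-check this result by machine: it suffices to verify that A sends the vector (1, 6, -13) to the zero vector. A minimal NumPy sketch, using the matrix A above:

import numpy as np

A = np.array([[ 1,  2,  1],
              [ 6, -1,  0],
              [-1, -2, -1]], dtype=float)

X = np.array([1, 6, -13], dtype=float)  # candidate eigenvector for lambda = 0
print(A @ X)                            # [0. 0. 0.], so A X = 0 = 0 * X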
2. Case $\lambda = -4$:
The associated eigenvectors are given by the linear system
\[
A X = -4 X, \quad\text{i.e.}\quad (A + 4 I_3)\, X = 0,
\]
which may be rewritten as
\[
\begin{cases} 5x + 2y + z = 0 \\ 6x + 3y = 0 \\ -x - 2y + 3z = 0. \end{cases}
\]
In this case, we will use elementary operations to solve it. First we consider the augmented matrix $[\,A + 4I_3 \mid 0\,]$, i.e.
\[
\left(\begin{array}{ccc|c} 5 & 2 & 1 & 0 \\ 6 & 3 & 0 & 0 \\ -1 & -2 & 3 & 0 \end{array}\right).
\]
Then we use elementary row operations to reduce it to an upper-triangular form. First we interchange the first row with the third one to get
\[
\left(\begin{array}{ccc|c} -1 & -2 & 3 & 0 \\ 6 & 3 & 0 & 0 \\ 5 & 2 & 1 & 0 \end{array}\right).
\]
Next, we use the first row to eliminate the 6 and the 5 in the first column (adding 6 times the first row to the second row and 5 times the first row to the third). We obtain
\[
\left(\begin{array}{ccc|c} -1 & -2 & 3 & 0 \\ 0 & -9 & 18 & 0 \\ 0 & -8 & 16 & 0 \end{array}\right).
\]
If we divide the second row by -9 and the third row by -8, we obtain
\[
\left(\begin{array}{ccc|c} -1 & -2 & 3 & 0 \\ 0 & 1 & -2 & 0 \\ 0 & 1 & -2 & 0 \end{array}\right).
\]
Finally, we subtract the second row from the third to get
\[
\left(\begin{array}{ccc|c} -1 & -2 & 3 & 0 \\ 0 & 1 & -2 & 0 \\ 0 & 0 & 0 & 0 \end{array}\right).
\]
Next, we set z = c. From the second row, we get y = 2z = 2c. The first row then implies x = -2y + 3z = -c. Hence
\[
X = \begin{pmatrix} -c \\ 2c \\ c \end{pmatrix} = c \begin{pmatrix} -1 \\ 2 \\ 1 \end{pmatrix}.
\]
Therefore, any eigenvector X of A associated to the eigenvalue -4 is given by
\[
X = c \begin{pmatrix} -1 \\ 2 \\ 1 \end{pmatrix},
\]
where c is an arbitrary number.
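The same elimination can be reproduced with a computer algebra system; the following SymPy sketch row-reduces the coefficient matrix A + 4I and recovers the same one-parameter family of solutions:

import sympy as sp

# Coefficient matrix A + 4I of the system above.
M = sp.Matrix([[ 5,  2, 1],
               [ 6,  3, 0],
               [-1, -2, 3]])

rref, pivots = M.rref()
print(rref)    # Matrix([[1, 0, 1], [0, 1, -2], [0, 0, 0]])
# Reading the reduced form with z = c: x = -c and y = 2c,
# i.e. X = c * (-1, 2, 1), matching the hand computation.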
3. Case $\lambda = 3$:
The details for this case will be left to the reader. Using similar ideas as the ones described above, one may easily show that any eigenvector X of A associated to the eigenvalue 3 is given by
\[
X = c \begin{pmatrix} 2 \\ 3 \\ -2 \end{pmatrix},
\]
where c is an arbitrary number.
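As a cross-check on the whole example, one may let NumPy compute all the eigenpairs at once. Note that np.linalg.eig returns unit-length eigenvectors, so each column it produces is a scalar multiple of one of the vectors found by hand:

import numpy as np

A = np.array([[ 1,  2,  1],
              [ 6, -1,  0],
              [-1, -2, -1]], dtype=float)

vals, vecs = np.linalg.eig(A)
print(np.round(vals, 6))                 # 0, -4 and 3, in some order
for lam, v in zip(vals, vecs.T):
    print(np.allclose(A @ v, lam * v))   # True for every eigenpair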
Remark. In general, the eigenvalues of a matrix are not all distinct
from each other (see the page on the eigenvalues for more details). In the next
two examples, we discuss this problem.
Example. Consider the matrix
\[
A = \begin{pmatrix} 3 & 2 & 4 \\ 2 & 0 & 2 \\ 4 & 2 & 3 \end{pmatrix}.
\]
The characteristic equation of A is given by
\[
\det(A - \lambda I_3) = -(\lambda - 8)(\lambda + 1)^2 = 0.
\]
Hence the eigenvalues of A are -1 and 8. For the eigenvalue 8, it is easy to show that any eigenvector X is given by
\[
X = c \begin{pmatrix} 2 \\ 1 \\ 2 \end{pmatrix},
\]
where c is an arbitrary number. Let us focus on the eigenvalue -1. The associated eigenvectors are given by the linear system
\[
A X = -X, \quad\text{i.e.}\quad (A + I_3)\, X = 0,
\]
which may be rewritten as
\[
\begin{cases} 4x + 2y + 4z = 0 \\ 2x + y + 2z = 0 \\ 4x + 2y + 4z = 0. \end{cases}
\]
Clearly, the third equation is identical to the first one, which is also a multiple of the second equation. In other words, this system is equivalent to the single equation
\[
2x + y + 2z = 0.
\]
To solve it, we need to fix two of the unknowns and deduce the third one. For example, if we set $x = c_1$ and $z = c_2$, we obtain $y = -2c_1 - 2c_2$. Therefore, any eigenvector X of A associated to the eigenvalue -1 is given by
\[
X = \begin{pmatrix} c_1 \\ -2c_1 - 2c_2 \\ c_2 \end{pmatrix}
  = c_1 \begin{pmatrix} 1 \\ -2 \\ 0 \end{pmatrix} + c_2 \begin{pmatrix} 0 \\ -2 \\ 1 \end{pmatrix}.
\]
In other words, any eigenvector X of A associated to the eigenvalue -1 is a linear combination of the two eigenvectors
\[
\begin{pmatrix} 1 \\ -2 \\ 0 \end{pmatrix} \quad\text{and}\quad \begin{pmatrix} 0 \\ -2 \\ 1 \end{pmatrix}.
\]
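In matrix terms, the eigenvectors associated to -1 are the nonzero vectors of the null space of A + I, which is two-dimensional here. A short SymPy sketch (the basis it returns may be scaled differently from the two vectors above):

import sympy as sp

A = sp.Matrix([[3, 2, 4],
               [2, 0, 2],
               [4, 2, 3]])

# Eigenvectors for lambda = -1 are the nonzero vectors of the null space of A + I.
basis = (A + sp.eye(3)).nullspace()
print(len(basis))   # 2: the eigenspace is a plane
print(basis)        # two independent solutions of 2x + y + 2z = 0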
Example. Consider the matrix
\[
A = \begin{pmatrix} -2 & -1 \\ 1 & -4 \end{pmatrix}.
\]
The characteristic equation is given by
\[
\det(A - \lambda I_2) = (\lambda + 3)^2 = 0.
\]
Hence the matrix A has only one eigenvalue, namely -3. Let us find the associated eigenvectors. These are given by the linear system
\[
A X = -3X, \quad\text{i.e.}\quad (A + 3 I_2)\, X = 0,
\]
which may be rewritten as
\[
\begin{cases} x - y = 0 \\ x - y = 0. \end{cases}
\]
This system is equivalent to the single equation
\[
x - y = 0.
\]
So if we set x = c, then y = c and any eigenvector X of A associated to the eigenvalue -3 is given by
\[
X = c \begin{pmatrix} 1 \\ 1 \end{pmatrix}.
\]
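A computer algebra system confirms this behaviour: the eigenvalue -3 has algebraic multiplicity 2 but only one independent eigenvector. A short SymPy sketch using the matrix above:

import sympy as sp

A = sp.Matrix([[-2, -1],
               [ 1, -4]])

print(A.eigenvects())
# [(-3, 2, [Matrix([[1], [1]])])]: eigenvalue -3 with multiplicity 2
# and a single basis eigenvector, a multiple of (1, 1).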
Let us summarize what we did in the above examples.
Summary: Let A be a square matrix. Assume $\lambda$ is an eigenvalue of A. In order to find the associated eigenvectors, we do the following steps:
1. Write down the associated linear system
\[
A X = \lambda X, \quad\text{i.e.}\quad (A - \lambda I_n)\, X = 0.
\]
2. Solve the system.
3. Rewrite the unknown vector X as a linear combination of known vectors.
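These three steps can be carried out mechanically. The following Python (SymPy) sketch is only an illustration; the helper name eigenvectors_for is ours, not a standard function:

import sympy as sp

def eigenvectors_for(A, lam):
    """Basis of the eigenspace of A for the eigenvalue lam.

    Step 1: write down the system (A - lam*I) X = 0.
    Step 2: solve it, here by computing the null space.
    Step 3: every eigenvector is a linear combination of the
            basis vectors returned.
    """
    return (A - lam * sp.eye(A.rows)).nullspace()

# Example: the matrix of the first example and the eigenvalue -4.
A = sp.Matrix([[ 1,  2,  1],
               [ 6, -1,  0],
               [-1, -2, -1]])
print(eigenvectors_for(A, -4))   # one vector, a multiple of (-1, 2, 1)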
The above examples assume that the eigenvalue $\lambda$ is a real number. So one may wonder whether the eigenvalues of a matrix are always real. In general, this is not the case: eigenvalues may be complex numbers. However, for symmetric matrices the eigenvalues are always real. The proof of this fact in general is beyond the scope of these pages, but for square matrices of order 2 the proof is quite easy. Let us give it here for the sake of completeness.
Consider the symmetric square matrix
\[
A = \begin{pmatrix} a & b \\ b & c \end{pmatrix}.
\]
Its characteristic equation is given by
\[
\det(A - \lambda I_2) = \begin{vmatrix} a - \lambda & b \\ b & c - \lambda \end{vmatrix}
= \lambda^2 - (a + c)\lambda + (ac - b^2) = 0.
\]
This is a quadratic equation. The nature of its roots (which are the eigenvalues of A) depends on the sign of the discriminant
\[
\Delta = (a + c)^2 - 4(ac - b^2).
\]
Using algebraic manipulations, we get
\[
\Delta = a^2 + 2ac + c^2 - 4ac + 4b^2 = (a - c)^2 + 4b^2.
\]
Therefore, $\Delta$ is a nonnegative number, which implies that the eigenvalues of A are real numbers.
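This computation of the discriminant can also be checked symbolically; a small SymPy sketch:

import sympy as sp

a, b, c, lam = sp.symbols('a b c lambda', real=True)

# Characteristic polynomial of the symmetric matrix [[a, b], [b, c]].
p = sp.Matrix([[a - lam, b], [b, c - lam]]).det()
delta = sp.discriminant(p, lam)
print(sp.expand(delta - ((a - c)**2 + 4*b**2)))   # 0: the identity holds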
Remark. Note that the matrix A will have one eigenvalue, i.e. one double root, if and only if $\Delta = 0$. But this is possible only if $a = c$ and $b = 0$. In other words, we have
\[
A = a\, I_2.
\]