Background
We will now review some ideas from linear algebra. Proofs of the
theorems are either left as exercises or can be found in any standard text on
linear algebra. We know how to solve n linear equations in n unknowns. It was
assumed that the determinant of the matrix was nonzero and hence that the
solution was unique. In the case of a homogeneous system AX = 0, the unique
solution is the trivial solution X = 0. If det(A) = 0, there exist nontrivial
solutions to AX = 0. Suppose that det(A) = 0, and consider solutions to the
homogeneous linear system AX = 0. A homogeneous system of equations always has
the trivial solution X = 0. Gaussian elimination can be used to obtain the
reduced row echelon form, which is used to form a set of relationships between
the variables and a nontrivial solution.
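As a concrete sketch of this procedure (using an example singular matrix of our own choosing, not one from the text), sympy can carry out the row reduction and report the nontrivial solutions:

```python
import sympy as sp

# An example singular matrix (our own illustration): det(A) = 0,
# so the homogeneous system AX = 0 has nontrivial solutions.
A = sp.Matrix([[1, 2, 3],
               [2, 4, 6],
               [1, 1, 1]])

# The reduced row echelon form exposes the pivot and free variables.
R, pivots = A.rref()

# nullspace() turns the resulting relationships between the variables
# into a basis of nontrivial solution vectors.
solutions = A.nullspace()
```

Here rank(A) = 2, so one free variable remains and nullspace() returns a single vector; every nontrivial solution is a scalar multiple of it.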
Example 1. Find the nontrivial solutions to the homogeneous system AX = 0.
Background for Eigenvalues and Eigenvectors
Definition (Linearly Independent). The vectors V1, V2, ..., Vm are said to be
linearly independent if the equation

c1V1 + c2V2 + ... + cmVm = 0

implies that c1 = c2 = ... = cm = 0. If the vectors are not linearly
independent, they are said to be linearly dependent.
Two vectors in R^2 are linearly independent if and only if they are not
parallel. Three vectors in R^3 are linearly independent if and only if they do
not lie in the same plane.
Definition (Linearly Dependent). The vectors V1, V2, ..., Vm are said to be
linearly dependent if there exists a set of numbers c1, c2, ..., cm, not all
zero, such that

c1V1 + c2V2 + ... + cmVm = 0.
Theorem. The vectors V1, V2, ..., Vm are linearly dependent if and only if at
least one of them is a linear combination of the others.
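This can be tested numerically (a sketch with vectors of our own choosing; matrix_rank is a standard numpy routine) by stacking the vectors as columns and comparing the rank with the number of vectors:

```python
import numpy as np

# Three vectors in R^3 (values of our own choosing): v3 = v1 + 2*v2,
# so by the theorem the set is linearly dependent.
v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = v1 + 2 * v2

# Stack the vectors as columns; the set is linearly independent
# exactly when the matrix has full column rank.
M = np.column_stack([v1, v2, v3])
dependent = np.linalg.matrix_rank(M) < M.shape[1]
```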
A desirable feature for a vector space is the ability to express each vector as
a linear combination of vectors chosen from a small subset of vectors. This
motivates the next definition.
Definition (Basis). Suppose that S = {V1, V2, ..., Vm} is a set of m vectors in
R^n. The set S is called a basis for R^n if for every vector X in R^n there
exists a unique set of scalars c1, c2, ..., cm so that X can be expressed as
the linear combination

X = c1V1 + c2V2 + ... + cmVm.
Theorem. In R^n, any set of n linearly independent vectors forms a basis of
R^n. Each vector X in R^n is uniquely expressed as a linear combination of the
basis vectors.
Theorem. Let V1, V2, ..., Vm be vectors in R^n.
(i) If m > n, then the vectors are linearly dependent.
(ii) If m = n, then the vectors are linearly dependent if and only if
det(V) = 0, where V = [V1 V2 ... Vn] is the matrix whose columns are the
vectors.
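Part (ii) gives a quick computational test (a sketch, with 2 x 2 example matrices of our own choosing): place the n vectors as the columns of V and evaluate the determinant.

```python
import numpy as np

# Columns of each matrix hold the vectors (example values of our own).
# By the theorem, for m = n the vectors are linearly dependent
# exactly when det(V) = 0.
V_independent = np.array([[1.0, 0.0],
                          [0.0, 2.0]])
V_dependent = np.array([[1.0, 2.0],
                        [2.0, 4.0]])   # column 2 = 2 * column 1

det_indep = np.linalg.det(V_independent)   # nonzero -> independent
det_dep = np.linalg.det(V_dependent)       # zero (up to roundoff) -> dependent
```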
Applications of mathematics sometimes encounter the following questions: What
are the singularities of (A - λI)^(-1), where λ is a parameter? What is the
behavior of the sequence of vectors {A^j X0}? What are the geometric features
of a linear transformation? Solutions for problems in many different
disciplines, such as economics, engineering, and physics, can involve ideas
related to these questions. The theory of eigenvalues and eigenvectors is
powerful enough to help solve these otherwise intractable problems.
Let A be a square matrix of dimension n × n and let X be a vector of dimension
n. The product Y = AX can be viewed as a linear transformation from
n-dimensional space into itself. We want to find scalars λ for which there
exists a nonzero vector X such that

(1)        AX = λX;

that is, the linear transformation T(X) = AX maps X onto the multiple λX. When
this occurs, we call X an eigenvector that corresponds to the eigenvalue λ, and
together they form the eigenpair (λ, X) for A. In general, the scalar λ and
vector X can involve complex numbers. For simplicity, most of our illustrations
will involve real calculations. However, the techniques are easily extended to
the complex case. The n × n identity matrix I can be used to write equation (1)
in the form

(2)        (A - λI)X = 0.
The significance of equation (2) is that the product of the matrix A - λI and
the nonzero vector X is the zero vector! The theorem on homogeneous linear
systems says that (2) has nontrivial solutions if and only if the matrix
A - λI is singular, that is,

(3)        det(A - λI) = 0.

This determinant can be written in the form

(4)        | a11 - λ   a12       ...   a1n     |
           | a21       a22 - λ   ...   a2n     |
           | ...       ...             ...     |
           | an1       an2       ...   ann - λ |  = 0.
Definition (Characteristic Polynomial). When the determinant in (4) is
expanded, it becomes a polynomial of degree n, which is called the
characteristic polynomial

(5)        p(λ) = det(A - λI) = (-1)^n (λ^n + c1 λ^(n-1) + ... + cn).
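A short sympy computation (with a 2 x 2 example matrix of our own choosing) shows the expansion of det(A - λI) into the characteristic polynomial and its roots:

```python
import sympy as sp

lam = sp.symbols('lambda')

# A 2 x 2 example matrix of our own choosing.
A = sp.Matrix([[4, 1],
               [2, 3]])

# Expand det(A - lambda*I) as in (4) and (5): the characteristic
# polynomial, here of degree n = 2.
p = (A - lam * sp.eye(2)).det().expand()

# Its roots are the values of lambda that make A - lambda*I singular.
roots = sp.solve(sp.Eq(p, 0), lam)
```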
There exist exactly n roots (not necessarily distinct) of a polynomial of
degree n. Each root λ can be substituted into equation (2) to obtain an
underdetermined system of equations that has a corresponding nontrivial
solution vector X. If λ is real, a real eigenvector X can be constructed. For
emphasis, we state the following definitions.
Definition (Eigenvalue). If A is an n × n real matrix, then its n eigenvalues
λ1, λ2, ..., λn are the real and complex roots of the characteristic polynomial

p(λ) = det(A - λI).
Definition (Eigenvector). If λ is an eigenvalue of A and the nonzero vector V
has the property that

AV = λV,

then V is called an eigenvector of A corresponding to the eigenvalue λ.
Together, the eigenvalue λ and eigenvector V form the eigenpair (λ, V).
The characteristic polynomial p(λ) = det(A - λI) can be factored in the form

p(λ) = (-1)^n (λ - λ1)^(m1) (λ - λ2)^(m2) ... (λ - λk)^(mk),

where mj is called the multiplicity of the eigenvalue λj. The sum of the
multiplicities of all eigenvalues is n; that is,

n = m1 + m2 + ... + mk.
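The multiplicities can be read off with sympy, whose eigenvals() returns a map from eigenvalue to multiplicity; the 3 x 3 example matrix below is our own:

```python
import sympy as sp

# A 3 x 3 example of our own with a repeated eigenvalue:
# 2 appears with multiplicity 2, and 5 with multiplicity 1.
A = sp.Matrix([[2, 1, 0],
               [0, 2, 0],
               [0, 0, 5]])

# eigenvals() returns {eigenvalue: multiplicity}; the multiplicities
# m1 + m2 + ... + mk must add up to n = 3.
mults = A.eigenvals()
total = sum(mults.values())
```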
The next three results concern the existence of
eigenvectors.
Theorem (Corresponding Eigenvectors). Suppose that A is an n × n square matrix.
(a) For each distinct eigenvalue λ there exists at least one eigenvector V
corresponding to λ.
(b) If λ has multiplicity r, then there exist at most r linearly independent
eigenvectors V1, V2, ..., Vs (1 ≤ s ≤ r) that correspond to λ.
Theorem (Linearly Independent Eigenvectors). Suppose that A is an n × n square
matrix. If the eigenvalues λ1, λ2, ..., λk are distinct and (λj, Vj), for
j = 1, 2, ..., k, are the k eigenpairs, then {V1, V2, ..., Vk} is a set of k
linearly independent vectors.
Theorem (Complete Set of Eigenvectors). Suppose that A is an n × n square
matrix. If the eigenvalues of A are all distinct, then there exist n linearly
independent eigenvectors.
Finding eigenpairs by hand computation is usually done in the following
manner. The eigenvalue λ of multiplicity r is substituted into the equation

(A - λI)V = 0.

Then Gaussian elimination can be performed to obtain the row reduced echelon
form, which will involve n - k equations in n unknowns, where 1 ≤ k ≤ r. Hence
there are k free variables to choose. The free variables can be selected in a
judicious manner to produce k linearly independent solution vectors
V1, V2, ..., Vk that correspond to λ.
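This hand procedure can be mirrored in sympy (the example matrix is our own): substitute the eigenvalue, row reduce, count the free variables, and read off the independent eigenvectors.

```python
import sympy as sp

# Example matrix of our own: the eigenvalue 1 has multiplicity r = 2.
A = sp.Matrix([[1, 0, 0],
               [0, 1, 0],
               [0, 0, 3]])
lam = 1

# Substitute lambda into (A - lambda*I)V = 0 and row reduce.
M = A - lam * sp.eye(3)
R, pivots = M.rref()
k = 3 - len(pivots)        # number of free variables, 1 <= k <= r

# Choosing the free variables one at a time yields k linearly
# independent eigenvectors for this eigenvalue.
eigvecs = M.nullspace()
```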
Free Variables
When the linear system is underdetermined, we need to introduce free variables
in the proper locations. The following subroutine will rearrange the equations
and introduce free variables where they are needed. Then all that remains is
to find the row reduced echelon form a second time. This is done at the end of
the next example.
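The subroutine itself is not reproduced in this excerpt; a minimal Python sketch of the same idea, assuming sympy's rref is an acceptable stand-in for the elimination step, might look like this:

```python
import sympy as sp

def nontrivial_solutions(M):
    """Sketch of the idea described above (not the text's own subroutine):
    row reduce M, identify the free (non-pivot) columns, and set each
    free variable to 1 in turn to build independent solutions of M X = 0."""
    n = M.cols
    R, pivots = M.rref()
    free = [j for j in range(n) if j not in pivots]
    solutions = []
    for f in free:
        x = sp.zeros(n, 1)
        x[f] = 1
        # Back-substitute the pivot variables from the reduced rows.
        for i, p in enumerate(pivots):
            x[p] = -R[i, f]
        solutions.append(x)
    return solutions

# Underdetermined example of our own: 2 equations in 3 unknowns.
M = sp.Matrix([[1, 2, 3],
               [2, 4, 6]])
sols = nontrivial_solutions(M)
```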