Least Squares Polynomials

Theorem (Least-Squares Polynomial Curve Fitting). Given the n data points $(x_1,y_1), (x_2,y_2), \ldots, (x_n,y_n)$, the least-squares polynomial of degree m of the form

$$P_m(x) = c_0 + c_1 x + c_2 x^2 + \cdots + c_m x^m$$

that fits the n data points is obtained by solving the following linear system

$$\sum_{k=0}^{m} \left( \sum_{i=1}^{n} x_i^{\,j+k} \right) c_k \;=\; \sum_{i=1}^{n} y_i\, x_i^{\,j}, \qquad j = 0, 1, \ldots, m,$$

for the m+1 coefficients $c_0, c_1, \ldots, c_m$. These equations are referred to as the "normal equations."
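As a minimal sketch (not part of the original text, using the hypothetical helper name `normal_equation_polyfit`), the normal equations above can be assembled and solved directly with NumPy:

```python
import numpy as np

def normal_equation_polyfit(x, y, m):
    """Fit a degree-m least-squares polynomial by solving the normal equations.

    Returns coefficients c[0], ..., c[m] of P(x) = c0 + c1*x + ... + cm*x**m.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # A[j][k] = sum_i x_i^(j+k)  and  b[j] = sum_i y_i * x_i^j
    A = np.array([[np.sum(x ** (j + k)) for k in range(m + 1)]
                  for j in range(m + 1)])
    b = np.array([np.sum(y * x ** j) for j in range(m + 1)])
    return np.linalg.solve(A, b)

# Data sampled exactly from y = 1 + 2x + 3x^2 recovers those coefficients.
c = normal_equation_polyfit([0, 1, 2, 3, 4], [1, 6, 17, 34, 57], 2)
```

Note that forming A explicitly is fine for small m, but the normal equations become ill-conditioned for high-degree fits, which is one reason production solvers work with the design matrix instead.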
One thing is certain: to find the least-squares polynomial, the above linear system must be solved. There are various linear system solvers that could be used for this task. However, since this is such an important computation, most mathematical software packages have a built-in subroutine for this purpose. In Mathematica it is called the "Fit" procedure: Fit[data, funs, vars] finds a least-squares fit to a list of data as a linear combination of the functions funs of the variables vars.
We will check the "closeness of fit" with the root-mean-square (RMS) measure of the error in the fit:

$$\mathrm{RMS} = \left( \frac{1}{n} \sum_{i=1}^{n} \bigl( P_m(x_i) - y_i \bigr)^2 \right)^{1/2}.$$
Mathematica Subroutine (Least Squares Parabola).
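The Mathematica subroutine itself is not reproduced here. As a hedged stand-in, an equivalent least-squares parabola routine in Python might look like the following, with the m = 2 normal equations written out explicitly:

```python
import numpy as np

def least_squares_parabola(x, y):
    """Return (a, b, c) of the parabola y = a + b*x + c*x**2 minimizing the
    sum of squared errors (an illustrative analogue of the Mathematica
    subroutine, not the original code)."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # Normal equations for degree m = 2, written out term by term.
    A = np.array([[len(x),         x.sum(),        (x**2).sum()],
                  [x.sum(),        (x**2).sum(),   (x**3).sum()],
                  [(x**2).sum(),   (x**3).sum(),   (x**4).sum()]])
    b = np.array([y.sum(), (x * y).sum(), (x**2 * y).sum()])
    return np.linalg.solve(A, b)

# Data from y = 1 + 2x + 3x^2 recovers (a, b, c) = (1, 2, 3).
abc = least_squares_parabola([0, 1, 2, 3, 4], [1, 6, 17, 34, 57])
```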
Caution for polynomial curve fitting.
Something goes radically wrong if the data are simply "NOT polynomial." This phenomenon is called "polynomial wiggle." The next example illustrates this concept.
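Polynomial wiggle is easy to provoke; a short NumPy sketch (not the source's own example) fits a high-degree polynomial to samples of Runge's function and measures how far the fit strays between the sample points:

```python
import numpy as np

# Sample a decidedly non-polynomial function (Runge's function) at 11
# equally spaced points on [-5, 5].
x = np.linspace(-5.0, 5.0, 11)
y = 1.0 / (1.0 + x ** 2)

# Fit the highest-degree polynomial the data supports.
coeffs = np.polyfit(x, y, 10)

# Between the sample points the polynomial swings far from the function:
# this is "polynomial wiggle", worst near the ends of the interval.
midpoints = (x[:-1] + x[1:]) / 2.0
wiggle = np.abs(np.polyval(coeffs, midpoints) - 1.0 / (1.0 + midpoints ** 2))
print("largest deviation between samples:", wiggle.max())
```

Even though the polynomial matches the data closely at the sample points, its deviation at the midpoints is large, so a high degree is no guarantee of a good fit.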
Linear Least Squares
The linear least-squares problem is stated as follows. Suppose that the n data points $(x_1,y_1), (x_2,y_2), \ldots, (x_n,y_n)$ and a set of m linearly independent functions $f_1(x), f_2(x), \ldots, f_m(x)$ are given. We want to find m coefficients $c_1, c_2, \ldots, c_m$ so that the function $f(x)$ given by the linear combination

$$f(x) = \sum_{j=1}^{m} c_j f_j(x)$$

will minimize the sum of the squares of the errors

$$E(c_1, c_2, \ldots, c_m) = \sum_{i=1}^{n} \bigl( f(x_i) - y_i \bigr)^2.$$
Theorem (Linear Least Squares). The solution to the linear least-squares problem is found by creating the n × m matrix $F$ whose elements are

$$f_{i,j} = f_j(x_i).$$

The coefficients $c_1, c_2, \ldots, c_m$ are found by solving the linear system

$$F^{\mathsf{T}} F\, C = F^{\mathsf{T}} Y,$$

where $C = (c_1, c_2, \ldots, c_m)^{\mathsf{T}}$ and $Y = (y_1, y_2, \ldots, y_n)^{\mathsf{T}}$.
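The theorem translates almost line for line into code; a minimal sketch (the name `linear_least_squares` and the basis-function interface are assumptions, not from the source):

```python
import numpy as np

def linear_least_squares(funcs, x, y):
    """Solve the general linear least-squares problem.

    funcs : list of m basis functions f_1, ..., f_m, each mapping an array
            of x-values to an array of function values.
    Builds the matrix F with F[i][j] = f_j(x_i) and solves the normal
    equations F^T F C = F^T Y for the coefficient vector C.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    F = np.column_stack([f(x) for f in funcs])
    return np.linalg.solve(F.T @ F, F.T @ y)

# With the basis {1, x}, data from y = 1 + 2x recovers C = (1, 2).
c = linear_least_squares([lambda t: np.ones_like(t), lambda t: t],
                         [0, 1, 2], [1, 3, 5])
```

Any linearly independent basis works here, e.g. trigonometric or exponential functions, which is what makes this formulation more general than the polynomial case.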