We have already seen how polynomials can be used for data fitting and for approximating continuous functions. Since both of these problems are least squares problems, they can be simplified by selecting an orthogonal basis for the class of approximating polynomials. This leads us to the concept of orthogonal polynomials.
In this section, we study families of orthogonal polynomials associated with various inner products on C[a, b]. We will see that the polynomials in each of these classes satisfy a three-term recursion relation. This recursion relation is particularly useful in computer applications. Certain families of orthogonal polynomials have important applications in many areas of mathematics. We will refer to these polynomials as classical polynomials and examine them in more detail. In particular, the classical polynomials are solutions of certain classes of second-order linear differential equations that arise in the solution of many partial differential equations from mathematical physics.
Since the proof of Theorem 5.6.1 was by induction, the Gram–Schmidt process is valid for a denumerable set. Thus, if $x_1, x_2, \ldots$ is a sequence of vectors in an inner product space $V$ and $x_1, \ldots, x_n$ are linearly independent for each $n$, then the Gram–Schmidt process may be used to form a sequence $u_1, u_2, \ldots$, where $\{u_1, u_2, \ldots\}$ is an orthonormal set and

$$\operatorname{Span}(x_1, \ldots, x_n) = \operatorname{Span}(u_1, \ldots, u_n)$$

for each $n$. In particular, from the sequence $1, x, x^2, \ldots$, it is possible to construct an orthonormal sequence $p_0(x), p_1(x), \ldots$.
Let $P$ be the vector space of all polynomials and define the inner product $\langle \cdot, \cdot \rangle$ on $P$ by

$$\langle p, q \rangle = \int_a^b p(x)\, q(x)\, w(x)\, dx \tag{1}$$

where $w(x)$ is a positive continuous function. The interval can be taken as either open or closed and may be finite or infinite. If, however, the integral

$$\int_a^b p(x)\, q(x)\, w(x)\, dx$$

is improper, we require that it converge for every pair $p, q \in P$. A sequence of polynomials $p_0(x), p_1(x), \ldots$ with $\deg p_i(x) = i$ for each $i$ is said to be a sequence of orthogonal polynomials if $\langle p_i, p_j \rangle = 0$ whenever $i \neq j$.
Theorem 5.7.1 If $p_0, p_1, \ldots$ is a sequence of orthogonal polynomials, then

I. $p_0, \ldots, p_{n-1}$ form a basis for $P_n$.

II. $p_n \in P_n^{\perp}$ (i.e., $p_n$ is orthogonal to every polynomial of degree less than $n$).
Proof
It follows from Theorem 5.5.1 that $p_0, p_1, \ldots, p_{n-1}$ are linearly independent in $P_n$. Since $\dim P_n = n$, these $n$ vectors must form a basis for $P_n$. Let $p(x)$ be any polynomial of degree less than $n$. Then

$$p(x) = \sum_{i=0}^{n-1} c_i\, p_i(x)$$

and hence

$$\langle p_n, p \rangle = \sum_{i=0}^{n-1} c_i \langle p_n, p_i \rangle = 0$$

Therefore, $p_n \in P_n^{\perp}$.
∎
If $\{p_0, p_1, \ldots, p_{n-1}\}$ is an orthogonal set in $P_n$ and

$$u_i = \frac{1}{\| p_i \|}\, p_i \qquad \text{for } i = 0, \ldots, n-1$$

then $\{u_0, \ldots, u_{n-1}\}$ is an orthonormal basis for $P_n$. Hence, if $p \in P_n$, then

$$p = \sum_{i=0}^{n-1} \langle p, u_i \rangle\, u_i$$

Similarly, if $f \in C[a, b]$, then the best least squares approximation to $f$ by the elements of $P_n$ is given by

$$p(x) = \sum_{i=0}^{n-1} \langle f, u_i \rangle\, u_i(x)$$

where $u_0, u_1, \ldots, u_{n-1}$ are orthonormal polynomials.
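As an illustration of the least squares formula, the sketch below (the helper names `inner_l2`, `legendre_basis`, and `best_coeffs` are my own) approximates $f(x) = x^3$ on $[-1, 1]$ by polynomials of degree less than 3, using a composite midpoint rule for the integrals. Since the basis polynomials here are not normalized, each coefficient is computed as $\langle f, p_i \rangle / \langle p_i, p_i \rangle$ rather than $\langle f, u_i \rangle$.

```python
# Sketch: best least squares approximation of f(x) = x^3 by polynomials of
# degree < 3 on [-1, 1], expanded in (unnormalized) Legendre polynomials.
# The coefficient of p_i is <f, p_i> / <p_i, p_i>; integrals are estimated
# with a composite midpoint rule.

def inner_l2(f, g, m=4000):
    """Midpoint-rule estimate of the inner product ∫_{-1}^{1} f(x) g(x) dx."""
    h = 2.0 / m
    return h * sum(f(-1 + (i + 0.5) * h) * g(-1 + (i + 0.5) * h) for i in range(m))

legendre_basis = [
    lambda x: 1.0,                # P_0(x)
    lambda x: x,                  # P_1(x)
    lambda x: 1.5 * x * x - 0.5,  # P_2(x)
]

def best_coeffs(f, basis):
    """Expansion coefficients of the best least squares approximation."""
    return [inner_l2(f, p) / inner_l2(p, p) for p in basis]
```

For $f(x) = x^3$ this yields coefficients close to $(0, \tfrac{3}{5}, 0)$; that is, the best least squares approximation to $x^3$ from $P_3$ is $\tfrac{3}{5}x$.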
Another nice feature of sequences of orthogonal polynomials is that they satisfy a three-term recursion relation.
Theorem 5.7.2 Let $p_0, p_1, \ldots$ be a sequence of orthogonal polynomials. Let $a_i$ denote the lead coefficient of $p_i$ for each $i$, and define $p_{-1}(x)$ to be the zero polynomial. Then

$$\alpha_{n+1}\, p_{n+1}(x) = (x - \beta_{n+1})\, p_n(x) - \alpha_n \gamma_n\, p_{n-1}(x) \qquad (n \geq 0)$$

where $\alpha_0 = \gamma_0 = 1$ and

$$\alpha_n = \frac{a_{n-1}}{a_n}, \qquad \beta_n = \frac{\langle p_{n-1}, x p_{n-1} \rangle}{\langle p_{n-1}, p_{n-1} \rangle}, \qquad \gamma_n = \frac{\langle p_n, p_n \rangle}{\langle p_{n-1}, p_{n-1} \rangle} \qquad (n \geq 1)$$
Proof
Since $p_0, p_1, \ldots, p_{n+1}$ form a basis for $P_{n+2}$, we can write

$$x\, p_n(x) = \sum_{k=0}^{n+1} c_{nk}\, p_k(x) \tag{2}$$

where

$$c_{nk} = \frac{\langle x p_n, p_k \rangle}{\langle p_k, p_k \rangle} \tag{3}$$

For any inner product defined by (1),

$$\langle x f, g \rangle = \langle f, x g \rangle$$

In particular,

$$\langle x p_n, p_k \rangle = \langle p_n, x p_k \rangle$$

It follows from Theorem 5.7.1 that if $k \leq n - 2$, then

$$\langle x p_n, p_k \rangle = \langle p_n, x p_k \rangle = 0$$

Therefore, (2) simplifies to

$$x\, p_n(x) = c_{n,n-1}\, p_{n-1}(x) + c_{nn}\, p_n(x) + c_{n,n+1}\, p_{n+1}(x)$$

This equation can be rewritten in the form

$$c_{n,n+1}\, p_{n+1}(x) = (x - c_{nn})\, p_n(x) - c_{n,n-1}\, p_{n-1}(x) \tag{4}$$

Comparing the lead coefficients of the polynomials on each side of (4), we see that

$$c_{n,n+1}\, a_{n+1} = a_n$$

or

$$c_{n,n+1} = \frac{a_n}{a_{n+1}} = \alpha_{n+1} \tag{5}$$

It follows from (4), on taking the inner product of both sides with $p_n$ and using $\langle p_{n+1}, p_n \rangle = 0$, that

$$0 = \langle x p_n, p_n \rangle - c_{nn} \langle p_n, p_n \rangle$$

Thus,

$$c_{nn} = \frac{\langle p_n, x p_n \rangle}{\langle p_n, p_n \rangle} = \beta_{n+1}$$

It follows from (3) that

$$c_{n,n-1} = \frac{\langle x p_n, p_{n-1} \rangle}{\langle p_{n-1}, p_{n-1} \rangle} = \frac{\langle x p_{n-1}, p_n \rangle}{\langle p_{n-1}, p_{n-1} \rangle} = \frac{c_{n-1,n} \langle p_n, p_n \rangle}{\langle p_{n-1}, p_{n-1} \rangle}$$

and hence, by (5), we have

$$c_{n,n-1} = \alpha_n \gamma_n$$
∎
In generating a sequence of orthogonal polynomials by the recursion relation in Theorem 5.7.2, we are free to choose any nonzero lead coefficient that we want at each step. This is reasonable, since any nonzero multiple of a particular $p_{n+1}(x)$ will also be orthogonal to $p_0, \ldots, p_n$. If we were to choose our lead coefficients $a_i$ to be 1, for example, then the recursion relation would simplify to

$$p_{n+1}(x) = (x - \beta_{n+1})\, p_n(x) - \gamma_n\, p_{n-1}(x)$$
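To make the simplified monic recursion concrete, here is a minimal numerical sketch (all helper names are my own, and the weight $w(x) = 1$ on $[-1, 1]$ is an illustrative assumption): the coefficients $\beta_{n+1}$ and $\gamma_n$ are estimated from the inner products by a composite midpoint rule, and with this weight the recursion produces the monic Legendre polynomials.

```python
# Sketch: build monic orthogonal polynomials from the recursion
#   p_{n+1}(x) = (x - b_{n+1}) p_n(x) - g_n p_{n-1}(x),
# where b_{n+1} = <x p_n, p_n>/<p_n, p_n> and g_n = <p_n, p_n>/<p_{n-1}, p_{n-1}>.
# The weight w(x) = 1 on [-1, 1] is an illustrative choice; it yields the
# monic Legendre polynomials (x^2 - 1/3, x^3 - 3x/5, ...).

def inner(f, g, a=-1.0, b=1.0, m=2000):
    """Midpoint-rule estimate of the inner product ∫_a^b f(x) g(x) dx."""
    h = (b - a) / m
    return h * sum(f(a + (i + 0.5) * h) * g(a + (i + 0.5) * h) for i in range(m))

def monic_sequence(n):
    """Return [p_0, ..., p_n] as callables, built by the three-term recursion."""
    p_prev = lambda x: 0.0  # p_{-1}, the zero polynomial
    p_cur = lambda x: 1.0   # p_0
    seq = [p_cur]
    for _ in range(n):
        norm = inner(p_cur, p_cur)
        beta = inner(lambda x: x * p_cur(x), p_cur) / norm
        prev_norm = inner(p_prev, p_prev)
        # gamma is irrelevant on the first step, where p_{-1} = 0
        gamma = norm / prev_norm if prev_norm > 1e-14 else 0.0
        p_next = (lambda pc, pp, b_, g_: lambda x: (x - b_) * pc(x) - g_ * pp(x))(
            p_cur, p_prev, beta, gamma)
        p_prev, p_cur = p_cur, p_next
        seq.append(p_cur)
    return seq
```

The closure factory on `p_next` freezes the current `beta`, `gamma`, and polynomials at each step, so every element of the returned sequence is a fixed polynomial rather than a reference to loop variables.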
Let us now look at some examples. Because of their importance, we will consider the classical polynomials beginning with the simplest, the Legendre polynomials.
The Legendre polynomials are orthogonal with respect to the inner product

$$\langle p, q \rangle = \int_{-1}^{1} p(x)\, q(x)\, dx$$

Let $P_n(x)$ denote the Legendre polynomial of degree $n$. If we choose the lead coefficients so that $P_n(1) = 1$ for each $n$, then the recursion formula for the Legendre polynomials is

$$(n+1)\, P_{n+1}(x) = (2n+1)\, x\, P_n(x) - n\, P_{n-1}(x)$$

By the use of this formula, the sequence of Legendre polynomials is easily generated. The first five polynomials of the sequence are

$$P_0(x) = 1, \qquad P_1(x) = x, \qquad P_2(x) = \tfrac{1}{2}(3x^2 - 1)$$

$$P_3(x) = \tfrac{1}{2}(5x^3 - 3x), \qquad P_4(x) = \tfrac{1}{8}(35x^4 - 30x^2 + 3)$$
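The recursion can be carried out mechanically on coefficient lists; a minimal sketch (the function and helper names are my own):

```python
# Sketch: generate the Legendre polynomials from the recursion
#   (n+1) P_{n+1}(x) = (2n+1) x P_n(x) - n P_{n-1}(x),
# with polynomials stored as coefficient lists [c0, c1, ...] in ascending powers.

def poly_add(p, q):
    """Coefficientwise sum of two polynomials."""
    n = max(len(p), len(q))
    return [(p[i] if i < len(p) else 0.0) + (q[i] if i < len(q) else 0.0)
            for i in range(n)]

def poly_scale(c, p):
    """Multiply a polynomial by the scalar c."""
    return [c * a for a in p]

def poly_shift(p):
    """Multiply a polynomial by x."""
    return [0.0] + p

def legendre_coeffs(n):
    """Coefficient lists of P_0, ..., P_n."""
    P = [[1.0], [0.0, 1.0]]  # P_0(x) = 1, P_1(x) = x
    for k in range(1, n):
        term = poly_scale((2 * k + 1) / (k + 1), poly_shift(P[k]))
        P.append(poly_add(term, poly_scale(-k / (k + 1), P[k - 1])))
    return P[: n + 1]
```

Running `legendre_coeffs(4)` reproduces the five polynomials listed above; for example, the last list corresponds to $\tfrac{1}{8}(35x^4 - 30x^2 + 3)$.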
The Chebyshev polynomials are orthogonal with respect to the inner product

$$\langle p, q \rangle = \int_{-1}^{1} p(x)\, q(x)\, (1 - x^2)^{-1/2}\, dx$$

It is customary to normalize the lead coefficients so that $a_0 = 1$ and $a_k = 2^{k-1}$ for $k = 1, 2, \ldots$. The Chebyshev polynomials are denoted by $T_n(x)$ and have the interesting property that

$$T_n(\cos \theta) = \cos n\theta$$

This property, together with the trigonometric identity

$$\cos(n+1)\theta = 2 \cos \theta \cos n\theta - \cos(n-1)\theta$$

can be used to derive the recursion relations

$$T_1(x) = x\, T_0(x), \qquad T_{n+1}(x) = 2x\, T_n(x) - T_{n-1}(x) \quad \text{for } n \geq 1$$
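A short sketch (the function name is my own) that evaluates $T_n(x)$ by the recursion; the property $T_n(\cos \theta) = \cos n\theta$ provides a convenient check:

```python
# Sketch: evaluate the Chebyshev polynomial T_n at x using the recursion
#   T_0(x) = 1,  T_1(x) = x,  T_{n+1}(x) = 2x T_n(x) - T_{n-1}(x),
# which can be checked against the identity T_n(cos θ) = cos nθ.

import math

def chebyshev_value(n, x):
    """Value of T_n(x) computed by the three-term recursion."""
    if n == 0:
        return 1.0
    t_prev, t_cur = 1.0, x  # T_0(x), T_1(x)
    for _ in range(n - 1):
        t_prev, t_cur = t_cur, 2 * x * t_cur - t_prev
    return t_cur
```

For instance, `chebyshev_value(n, math.cos(theta))` agrees with `math.cos(n * theta)` to rounding error for small $n$.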
The Legendre and Chebyshev polynomials are both special cases of the Jacobi polynomials. The Jacobi polynomials are orthogonal with respect to the inner product

$$\langle p, q \rangle = \int_{-1}^{1} p(x)\, q(x)\, (1 - x)^{\lambda} (1 + x)^{\mu}\, dx$$

where $\lambda, \mu > -1$.
The Hermite polynomials $H_n(x)$ are defined on the interval $(-\infty, \infty)$. They are orthogonal with respect to the inner product

$$\langle p, q \rangle = \int_{-\infty}^{\infty} p(x)\, q(x)\, e^{-x^2}\, dx$$

The recursion relation for Hermite polynomials is given by

$$H_{n+1}(x) = 2x\, H_n(x) - 2n\, H_{n-1}(x)$$
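Evaluating the Hermite recursion is a one-loop computation; a minimal sketch (the function name is my own):

```python
# Sketch: evaluate the Hermite polynomial H_n at x using the recursion
#   H_0(x) = 1,  H_1(x) = 2x,  H_{n+1}(x) = 2x H_n(x) - 2n H_{n-1}(x).

def hermite_value(n, x):
    """Value of H_n(x) computed by the three-term recursion."""
    if n == 0:
        return 1.0
    h_prev, h_cur = 1.0, 2.0 * x  # H_0(x), H_1(x)
    for k in range(1, n):
        h_prev, h_cur = h_cur, 2 * x * h_cur - 2 * k * h_prev
    return h_cur
```

This reproduces, for example, $H_2(x) = 4x^2 - 2$ and $H_3(x) = 8x^3 - 12x$.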
The Laguerre polynomials $L_n^{(\lambda)}(x)$ are defined on the interval $(0, \infty)$ and are orthogonal with respect to the inner product

$$\langle p, q \rangle = \int_0^{\infty} p(x)\, q(x)\, x^{\lambda} e^{-x}\, dx$$

where $\lambda > -1$. The recursion relation for the Laguerre polynomials is given by

$$(n+1)\, L_{n+1}^{(\lambda)}(x) = (2n + \lambda + 1 - x)\, L_n^{(\lambda)}(x) - (n + \lambda)\, L_{n-1}^{(\lambda)}(x)$$
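The Laguerre recursion can be evaluated the same way; a minimal sketch (the function name is my own, and the case $\lambda = 0$ is chosen for illustration):

```python
# Sketch: evaluate the Laguerre polynomial L_n at x (λ = 0 case) using
#   (n+1) L_{n+1}(x) = (2n + 1 - x) L_n(x) - n L_{n-1}(x),
# with L_0(x) = 1 and L_1(x) = 1 - x.

def laguerre_value(n, x):
    """Value of L_n(x) for λ = 0, computed by the three-term recursion."""
    if n == 0:
        return 1.0
    l_prev, l_cur = 1.0, 1.0 - x  # L_0(x), L_1(x)
    for k in range(1, n):
        l_prev, l_cur = l_cur, ((2 * k + 1 - x) * l_cur - k * l_prev) / (k + 1)
    return l_cur
```

This reproduces, for example, $L_2(x) = \tfrac{1}{2}x^2 - 2x + 1$.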
The Chebyshev, Hermite, and Laguerre polynomials are compared in Table 5.7.1.
| $n$ | Chebyshev $T_n(x)$ | Hermite $H_n(x)$ | Laguerre $L_n^{(0)}(x)$ |
|---|---|---|---|
| 0 | $1$ | $1$ | $1$ |
| 1 | $x$ | $2x$ | $1 - x$ |
| 2 | $2x^2 - 1$ | $4x^2 - 2$ | $\tfrac{1}{2}x^2 - 2x + 1$ |
| 3 | $4x^3 - 3x$ | $8x^3 - 12x$ | $-\tfrac{1}{6}x^3 + \tfrac{3}{2}x^2 - 3x + 1$ |
Theorem 5.7.3 If $p_0, p_1, \ldots$ is a sequence of orthogonal polynomials with respect to the inner product (1), then the zeros of $p_n(x)$ are all real and distinct and lie in the interval $(a, b)$.
Proof
Let $x_1, x_2, \ldots, x_m$ be the zeros of $p_n(x)$ that lie in $(a, b)$ and for which $p_n(x)$ changes sign. Thus, $p_n(x)$ must have a factor of $(x - x_i)^{k_i}$, where $k_i$ is odd, for $i = 1, \ldots, m$. We may write

$$p_n(x) = (x - x_1)^{k_1} (x - x_2)^{k_2} \cdots (x - x_m)^{k_m}\, q(x)$$

where $q(x)$ does not change sign on $(a, b)$ and $q(x_i) \neq 0$ for $i = 1, \ldots, m$. Clearly, $m \leq n$. We will show that $m = n$. Let

$$r(x) = (x - x_1)(x - x_2) \cdots (x - x_m)$$

The product

$$p_n(x)\, r(x) = (x - x_1)^{k_1 + 1} (x - x_2)^{k_2 + 1} \cdots (x - x_m)^{k_m + 1}\, q(x)$$

will involve only even powers of $(x - x_i)$ for each $i$ and hence will not change sign on $(a, b)$. Therefore,

$$\langle p_n, r \rangle = \int_a^b p_n(x)\, r(x)\, w(x)\, dx \neq 0$$

Since $p_n$ is orthogonal to all polynomials of degree less than $n$, it follows that $\deg r(x) = m \geq n$ and hence $m = n$.
∎
Numerical integration formulas of the form

$$\int_a^b f(x)\, w(x)\, dx \approx \sum_{i=1}^{n} A_i\, f(x_i) \tag{7}$$

where the $x_i$ are roots of orthogonal polynomials, are called Gaussian quadrature formulas. Such formulas are exact for all polynomials of degree less than $2n$; the proof of this exactness can be found in most undergraduate numerical analysis textbooks.
Actually, it is not necessary to perform $n$ integrations to calculate the quadrature coefficients $A_i$. They can be determined by solving an $n \times n$ linear system. Exercise 16 illustrates how this is done when the roots of a Legendre polynomial are used in a quadrature rule for approximating

$$\int_{-1}^{1} f(x)\, dx$$
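The linear-system approach can be sketched as follows (all names are my own, and the use of the two roots $\pm 1/\sqrt{3}$ of the degree-2 Legendre polynomial is an illustrative assumption): requiring the rule to integrate $1, x, \ldots, x^{n-1}$ exactly gives $n$ linear equations for the $n$ coefficients $A_i$.

```python
# Sketch: compute Gaussian quadrature coefficients A_i by requiring that
#   sum_i A_i x_i^j = ∫_{-1}^{1} x^j dx   for j = 0, ..., n-1,
# where the nodes x_i are the roots of a Legendre polynomial. The resulting
# n x n (Vandermonde-type) system is solved by Gaussian elimination.

import math

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def quadrature_coeffs(nodes):
    """Coefficients A_i of the rule sum A_i f(x_i) ≈ ∫_{-1}^{1} f(x) dx."""
    n = len(nodes)
    rows = [[x ** j for x in nodes] for j in range(n)]
    # ∫_{-1}^{1} x^j dx = 2/(j+1) for even j and 0 for odd j
    rhs = [2.0 / (j + 1) if j % 2 == 0 else 0.0 for j in range(n)]
    return solve(rows, rhs)
```

For the roots $\pm 1/\sqrt{3}$ of the degree-2 Legendre polynomial, this gives $A_1 = A_2 = 1$, and the resulting two-point rule integrates every cubic exactly, consistent with the degree-less-than-$2n$ exactness of Gaussian quadrature.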