4.4 Bases and Dimension for Vector Spaces

An especially useful way of describing the solution space of a homogeneous linear system is to list explicitly a set S of solution vectors such that every solution vector is a unique linear combination of these particular ones. The following definition specifies the properties of such a set S of “basic” solution vectors, and the concept is equally important for vector spaces other than solution spaces.

In short, a basis for the vector space V is a linearly independent spanning set of vectors in V. Thus, if $S = \{v_1, v_2, \ldots, v_n\}$ is a basis for V, then any vector w in V can be expressed as a linear combination

$$w = c_1 v_1 + c_2 v_2 + \cdots + c_n v_n \tag{1}$$

of the vectors in S, and we saw in Section 4.3 that the linear independence of S implies that the coefficients $c_1, c_2, \ldots, c_n$ in (1) are unique. That is, w cannot be expressed differently as a linear combination of the basis vectors $v_1, v_2, \ldots, v_n$.
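Because a basis is linearly independent, the coefficient vector can be found by solving a square linear system whose columns are the basis vectors. A minimal numerical sketch (the basis $\{(1,0), (1,1)\}$ and the vector w are illustrative choices, not taken from the text):

```python
import numpy as np

# With the basis vectors as the columns of an invertible matrix A,
# the coefficients c in w = c1*v1 + ... + cn*vn are the unique
# solution of the linear system A c = w.
A = np.column_stack([(1, 0), (1, 1)])   # basis vectors as columns
w = np.array([3, 2])
c = np.linalg.solve(A, w)               # unique coefficient vector
print(c)                                # [1. 2.], since w = 1*(1,0) + 2*(1,1)
```

Uniqueness of `c` corresponds to the invertibility of `A`, which holds exactly when the columns are linearly independent.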

Example 1

The standard basis for $\mathbb{R}^n$ consists of the unit vectors

$$e_1 = (1, 0, 0, \ldots, 0),\quad e_2 = (0, 1, 0, \ldots, 0),\quad \ldots,\quad e_n = (0, 0, 0, \ldots, 1).$$

If $x = (x_1, x_2, \ldots, x_n)$ is a vector in $\mathbb{R}^n$, then

$$x = x_1 e_1 + x_2 e_2 + \cdots + x_n e_n.$$

Thus the column vectors

$$e_1 = \begin{bmatrix} 1 \\ 0 \\ \vdots \\ 0 \end{bmatrix},\quad e_2 = \begin{bmatrix} 0 \\ 1 \\ \vdots \\ 0 \end{bmatrix},\quad \ldots,\quad e_n = \begin{bmatrix} 0 \\ 0 \\ \vdots \\ 1 \end{bmatrix}$$

of the $n \times n$ identity matrix span $\mathbb{R}^n$, and we noted in Example 4 of Section 4.3 that these standard unit vectors are linearly independent.

Example 2

Let $v_1, v_2, \ldots, v_n$ be n linearly independent vectors in $\mathbb{R}^n$. We saw in Section 4.3 that any set of more than n vectors in $\mathbb{R}^n$ is linearly dependent. Hence, given a vector w in $\mathbb{R}^n$, there exist scalars $c, c_1, c_2, \ldots, c_n$ not all zero such that

$$c w + c_1 v_1 + c_2 v_2 + \cdots + c_n v_n = 0. \tag{2}$$

If c were zero, then (2) would imply that the vectors $v_1, v_2, \ldots, v_n$ are linearly dependent. Hence $c \neq 0$, so Eq. (2) can be solved for w as a linear combination of $v_1, v_2, \ldots, v_n$. Thus the linearly independent vectors $v_1, v_2, \ldots, v_n$ also span $\mathbb{R}^n$ and therefore constitute a basis for $\mathbb{R}^n$.

Example 2 shows that any set of n linearly independent vectors in $\mathbb{R}^n$ is a basis for $\mathbb{R}^n$. By Theorem 2 in Section 4.3, we can therefore determine whether n given vectors $v_1, v_2, \ldots, v_n$ form a basis for $\mathbb{R}^n$ by calculating the determinant of the $n \times n$ matrix

$$A = \begin{bmatrix} v_1 & v_2 & \cdots & v_n \end{bmatrix}$$

with these vectors as its column vectors. They constitute a basis for $\mathbb{R}^n$ if and only if $\det A \neq 0$.
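This determinant criterion is straightforward to apply numerically. A small sketch (the helper name `is_basis` and the sample vectors are our own, and the tolerance guards against floating-point roundoff):

```python
import numpy as np

def is_basis(*vectors):
    """n vectors form a basis for R^n iff the matrix with those
    vectors as its columns has nonzero determinant."""
    A = np.column_stack(vectors)
    if A.shape[0] != A.shape[1]:
        return False  # need exactly n vectors in R^n
    return bool(abs(np.linalg.det(A)) > 1e-10)

print(is_basis((1, 0), (1, 1)))   # True
print(is_basis((1, 2), (2, 4)))   # False: second vector is a multiple of the first
```

For large or ill-conditioned matrices a rank computation is more robust than a raw determinant, but for small hand-sized examples the determinant test mirrors the text directly.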

Example 3

Let $v_1 = (1, -1, -2, -3)$, $v_2 = (1, -1, 2, 3)$, $v_3 = (1, -1, -3, -2)$, and $v_4 = (0, 3, -1, 2)$. Then we find that

$$\begin{vmatrix} 1 & 1 & 1 & 0 \\ -1 & -1 & -1 & 3 \\ -2 & 2 & -3 & -1 \\ -3 & 3 & -2 & 2 \end{vmatrix} = 30 \neq 0,$$

so it follows that $\{v_1, v_2, v_3, v_4\}$ is a basis for $\mathbb{R}^4$.
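As a check, the determinant computation in Example 3 can be verified numerically:

```python
import numpy as np

# Columns of A are the four vectors of Example 3; the determinant
# should come out to 30, hence nonzero.
v1 = (1, -1, -2, -3)
v2 = (1, -1, 2, 3)
v3 = (1, -1, -3, -2)
v4 = (0, 3, -1, 2)
A = np.column_stack([v1, v2, v3, v4])
print(round(np.linalg.det(A)))   # 30
```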

Theorem 1 has the following import: Just as in $\mathbb{R}^n$, a basis for any vector space V contains the largest possible number of linearly independent vectors in V.

Now let $S = \{v_1, v_2, \ldots, v_n\}$ and $T = \{w_1, w_2, \ldots, w_m\}$ be two different bases for the same vector space V. Because S is a basis and T is linearly independent, Theorem 1 implies that $m \leq n$. Next, reverse the roles of S and T; the fact that T is a basis and S is linearly independent implies similarly that $n \leq m$. Hence $m = n$, so we have proved the following theorem for any vector space with a finite basis.

A nonzero vector space V is called finite dimensional provided that there exists a basis for V consisting of a finite number of vectors from V. In this case the number n of vectors in each basis for V is called the dimension of V, denoted by $n = \dim V$. Then V is an n-dimensional vector space. The standard basis of Example 1 shows that $\mathbb{R}^n$ is, indeed, an n-dimensional vector space.

Note that the zero vector space $\{0\}$ has no basis because it contains no linearly independent set of vectors. (Sometimes it is convenient to adopt the convention that the null set is a basis for $\{0\}$.) Here we define $\dim\{0\}$ to be zero. A nonzero vector space that has no finite basis is called infinite dimensional. Infinite dimensional vector spaces are discussed in Section 4.5, but we include an illustrative example of one here.

Example 4

Let $\mathcal{P}$ be the set of all polynomials of the form

$$p(x) = a_0 + a_1 x + a_2 x^2 + \cdots + a_n x^n,$$

where the largest exponent $n \geq 0$ that appears is the degree of the polynomial p(x), and the coefficients $a_0, a_1, a_2, \ldots, a_n$ are real numbers. We add polynomials in $\mathcal{P}$ and multiply them by scalars in the usual way, that is, by collecting coefficients of like powers of x. For instance, if

$$p(x) = 3 + 2x + 5x^3 \qquad\text{and}\qquad q(x) = 7 + 4x + 3x^2 + 9x^4,$$

then

$$\begin{aligned} (p+q)(x) &= (3+7) + (2+4)x + (0+3)x^2 + (5+0)x^3 + (0+9)x^4 \\ &= 10 + 6x + 3x^2 + 5x^3 + 9x^4 \end{aligned}$$

and

$$(7p)(x) = 7(3 + 2x + 5x^3) = 21 + 14x + 35x^3.$$

It is readily verified that, with these operations, $\mathcal{P}$ is a vector space. But $\mathcal{P}$ has no finite basis. For if $p_1, p_2, \ldots, p_n$ are elements of $\mathcal{P}$, then the degree of any linear combination of them is at most the maximum of their degrees. Hence no polynomial in $\mathcal{P}$ of higher degree lies in $\mathrm{span}\{p_1, p_2, \ldots, p_n\}$. Thus no finite subset of $\mathcal{P}$ spans $\mathcal{P}$, and therefore $\mathcal{P}$ is an infinite dimensional vector space.
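The coefficientwise operations used above can be sketched in code by representing each polynomial as its list of coefficients, constant term first; the helper names are illustrative:

```python
def poly_add(p, q):
    """Add two polynomials given as coefficient lists (constant first)."""
    n = max(len(p), len(q))
    p = p + [0] * (n - len(p))   # pad the shorter list with zero coefficients
    q = q + [0] * (n - len(q))
    return [a + b for a, b in zip(p, q)]

def poly_scale(c, p):
    """Multiply a polynomial by the scalar c."""
    return [c * a for a in p]

p = [3, 2, 0, 5]          # 3 + 2x + 5x^3
q = [7, 4, 3, 0, 9]       # 7 + 4x + 3x^2 + 9x^4
print(poly_add(p, q))     # [10, 6, 3, 5, 9]  i.e. 10 + 6x + 3x^2 + 5x^3 + 9x^4
print(poly_scale(7, p))   # [21, 14, 0, 35]   i.e. 21 + 14x + 35x^3
```

The padding step makes the point of the example concrete: sums and scalar multiples never raise the degree, so no finite set of polynomials can span all of $\mathcal{P}$.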

Here, our concern is with finite-dimensional vector spaces, and we note first that any proper subspace W of a finite-dimensional vector space V is itself finite dimensional, with $\dim W < \dim V$. For if $\dim V = n$, let $k \leq n$ be the largest integer such that W contains k linearly independent vectors $v_1, v_2, \ldots, v_k$. Then, by the same argument as in Example 2, we see that $\{v_1, v_2, \ldots, v_k\}$ is a basis for W, so W is finite dimensional with $\dim W = k$, and $k < n$ because W is a proper subspace.

Moreover, an n-dimensional vector space V contains proper subspaces of each dimension $k = 1, 2, \ldots, n-1$. For instance, if $\{v_1, v_2, \ldots, v_n\}$ is a basis for V and $k < n$, then $W = \mathrm{span}\{v_1, v_2, \ldots, v_k\}$ is a subspace of dimension k. Thus $\mathbb{R}^4$ contains proper subspaces of dimensions 1, 2, and 3; $\mathbb{R}^5$ contains proper subspaces of dimensions 1, 2, 3, and 4; and so on. The proper subspaces of $\mathbb{R}^n$ of dimensions $1, 2, \ldots, n-1$ are the higher-dimensional analogues of lines and planes through the origin in $\mathbb{R}^3$.

Suppose that V is an n-dimensional vector space and that $S = \{v_1, v_2, \ldots, v_n\}$ is a set of n vectors in V. Then, in order to show that S is a basis for V, it is not necessary to prove both that S is linearly independent and that S spans V, because it turns out (for n vectors in an n-dimensional vector space) that either property of S implies the other. For instance, if $S = \{v_1, v_2, \ldots, v_n\}$ is a set of n linearly independent vectors in V, then Theorem 1 and the argument of Example 2 (once again) imply that S spans V and, hence, is a basis for V. This proves part (a) of Theorem 3. The remaining parts are left to the problems. (See Problems 27–32.)

Part (c) of Theorem 3 is often applied in the following form: If W is a k-dimensional subspace of the n-dimensional vector space V, then any basis $\{v_1, v_2, \ldots, v_k\}$ for W can be “extended” to a basis $\{v_1, v_2, \ldots, v_n\}$ for V, consisting of the original basis vectors for W together with $n - k$ additional vectors $v_{k+1}, v_{k+2}, \ldots, v_n$.

Bases for Solution Spaces

We consider now the homogeneous linear system

$$A x = 0, \tag{7}$$

in which A is an $m \times n$ matrix, so the system consists of m equations in the n variables $x_1, x_2, \ldots, x_n$. Its solution space W is then a subspace of $\mathbb{R}^n$. We want to determine the dimension of W and, moreover, to find an explicit basis for W. Thus we seek a maximal set of linearly independent solution vectors of (7).

Recall the Gaussian elimination method of Section 3.2. We use elementary row operations to reduce the coefficient matrix A to an echelon matrix E and note the (nonzero) leading entries in the rows of E. The leading variables are those that correspond to the columns of E containing the leading entries. The remaining variables (if any) are the free variables. If there are no free variables, then our system has only the trivial solution, so $W = \{0\}$. If there are free variables, we set each of them (separately) equal to a parameter and solve (by back substitution) for the leading variables as linear combinations of these parameters. The solution space W is then the set of all solution vectors obtained in this manner (for all possible values of the parameters).

To illustrate the general situation, let us suppose that the leading variables are the first r variables $x_1, x_2, \ldots, x_r$, so the $k = n - r$ variables $x_{r+1}, x_{r+2}, \ldots, x_n$ are free variables. The reduced system $Ex = 0$ then takes the form

$$\begin{aligned} b_{11}x_1 + b_{12}x_2 + \cdots + b_{1r}x_r + \cdots + b_{1n}x_n &= 0 \\ b_{22}x_2 + \cdots + b_{2r}x_r + \cdots + b_{2n}x_n &= 0 \\ &\;\,\vdots \\ b_{rr}x_r + \cdots + b_{rn}x_n &= 0 \\ 0 &= 0 \\ &\;\,\vdots \\ 0 &= 0 \end{aligned} \tag{8}$$

with the last $m - r$ equations being “trivial.” We set

$$x_{r+1} = t_1,\quad x_{r+2} = t_2,\quad \ldots,\quad x_n = t_k \tag{9}$$

and then solve (by back substitution) the equations in (8) for the leading variables

$$\begin{aligned} x_1 &= c_{11}t_1 + c_{12}t_2 + \cdots + c_{1k}t_k \\ x_2 &= c_{21}t_1 + c_{22}t_2 + \cdots + c_{2k}t_k \\ &\;\,\vdots \\ x_r &= c_{r1}t_1 + c_{r2}t_2 + \cdots + c_{rk}t_k. \end{aligned} \tag{10}$$

The typical solution vector $(x_1, x_2, \ldots, x_n)$ is given in terms of the k parameters $t_1, t_2, \ldots, t_k$ by the equations in (9) and (10).

We now choose k particular solution vectors $v_1, v_2, \ldots, v_k$ as follows: To get $v_j$ we set the jth parameter $t_j$ equal to 1 and set all other parameters equal to zero. Then

$$v_j = (c_{1j}, c_{2j}, \ldots, c_{rj}, 0, \ldots, 1, \ldots, 0), \tag{11}$$

with the 1 appearing as the $(r+j)$th entry. The vectors $v_1, v_2, \ldots, v_k$ are the column vectors of the $n \times k$ matrix

$$\begin{bmatrix} c_{11} & c_{12} & \cdots & c_{1k} \\ c_{21} & c_{22} & \cdots & c_{2k} \\ \vdots & \vdots & & \vdots \\ c_{r1} & c_{r2} & \cdots & c_{rk} \\ 1 & 0 & \cdots & 0 \\ 0 & 1 & \cdots & 0 \\ \vdots & \vdots & & \vdots \\ 0 & 0 & \cdots & 1 \end{bmatrix}. \tag{12}$$

Because of the presence of the lower $k \times k$ identity matrix, it is clear that the vectors $v_1, v_2, \ldots, v_k$ are linearly independent. (See Problem 36.) But Eqs. (9) and (10) show that the typical solution vector x is a linear combination

$$x = t_1 v_1 + t_2 v_2 + \cdots + t_k v_k. \tag{13}$$

Therefore, the vectors $v_1, v_2, \ldots, v_k$ defined in (11) form a basis for the solution space W of the original system in (7).
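This construction, one basis vector per free variable, is what SymPy's `nullspace` method carries out after row reduction. A sketch, using an illustrative system not taken from the text:

```python
from sympy import Matrix

def solution_space_basis(A):
    """Return a basis for the solution space of A x = 0, one vector
    per free variable, exactly as in the construction above."""
    return Matrix(A).nullspace()   # list of k = n - r column vectors

# Illustrative single-equation system: x1 + x2 + x3 = 0.
basis = solution_space_basis([[1, 1, 1]])
print(len(basis))                  # 2 free variables -> 2 basis vectors
print([list(v) for v in basis])    # [[-1, 1, 0], [-1, 0, 1]]
```

Note the trailing 1s in positions 2 and 3 of the basis vectors, matching the identity-matrix block in (12).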

The following algorithm summarizes the steps in this procedure.

Example 5

Find a basis for the solution space of the homogeneous linear system

$$\begin{aligned} 3x_1 + 6x_2 - x_3 - 5x_4 + 5x_5 &= 0 \\ 2x_1 + 4x_2 - x_3 - 3x_4 + 2x_5 &= 0 \\ 3x_1 + 6x_2 - 2x_3 - 4x_4 + x_5 &= 0. \end{aligned} \tag{14}$$

Solution

We readily reduce the coefficient matrix A to the echelon form

$$E = \begin{bmatrix} 1 & 2 & 0 & -2 & 3 \\ 0 & 0 & 1 & -1 & 4 \\ 0 & 0 & 0 & 0 & 0 \end{bmatrix}.$$

The leading entries are in the first and third columns, so the leading variables are $x_1$ and $x_3$; the free variables are $x_2$, $x_4$, and $x_5$. To avoid subscripts, we use r, s, and t rather than $t_1$, $t_2$, and $t_3$ to denote the three parameters. Thus we set

$$x_2 = r,\quad x_4 = s,\quad\text{and}\quad x_5 = t. \tag{15}$$

Then back substitution in the reduced system

$$\begin{aligned} x_1 + 2x_2 - 2x_4 + 3x_5 &= 0 \\ x_3 - x_4 + 4x_5 &= 0 \end{aligned}$$

yields

$$x_1 = -2r + 2s - 3t \qquad\text{and}\qquad x_3 = s - 4t. \tag{16}$$

The equations in (15) and (16) give the typical solution vector $(x_1, x_2, x_3, x_4, x_5)$ in terms of the parameters r, s, and t.

With $r = 1$ and $s = t = 0$, we obtain $v_1 = (-2, 1, 0, 0, 0)$.

With $s = 1$ and $r = t = 0$, we obtain $v_2 = (2, 0, 1, 1, 0)$.

With $t = 1$ and $r = s = 0$, we obtain $v_3 = (-3, 0, -4, 0, 1)$.

Thus the solution space of the system in (14) is a 3-dimensional subspace of $\mathbb{R}^5$ with basis $\{v_1, v_2, v_3\}$.
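As a check, Example 5 can be verified with SymPy's `nullspace` method applied to the coefficient matrix of (14):

```python
from sympy import Matrix

# Coefficient matrix of the system in (14).
A = Matrix([[3, 6, -1, -5, 5],
            [2, 4, -1, -3, 2],
            [3, 6, -2, -4, 1]])
basis = A.nullspace()
print(len(basis))                  # 3, so the solution space is 3-dimensional
print([list(v) for v in basis])
# [[-2, 1, 0, 0, 0], [2, 0, 1, 1, 0], [-3, 0, -4, 0, 1]]
```

The three vectors agree with $v_1$, $v_2$, $v_3$ found by hand above.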

4.4 Problems

In Problems 1–8, determine whether or not the given vectors in $\mathbb{R}^n$ form a basis for $\mathbb{R}^n$.

  1. $v_1 = (4, 7)$, $v_2 = (5, 6)$

  2. $v_1 = (3, 1, 2)$, $v_2 = (6, 2, 4)$, $v_3 = (5, 3, 1)$

  3. $v_1 = (1, 7, 3)$, $v_2 = (2, 1, 4)$, $v_3 = (6, 5, 1)$, $v_4 = (0, 7, 13)$

  4. $v_1 = (3, 7, 5, 2)$, $v_2 = (1, 1, 3, 4)$, $v_3 = (7, 11, 3, 13)$

  5. $v_1 = (0, 7, 3)$, $v_2 = (0, 5, 4)$, $v_3 = (0, 5, 10)$

  6. $v_1 = (0, 0, 1)$, $v_2 = (0, 1, 2)$, $v_3 = (1, 2, 3)$

  7. $v_1 = (0, 0, 1)$, $v_2 = (7, 4, 11)$, $v_3 = (5, 3, 13)$

  8. $v_1 = (2, 0, 0, 0)$, $v_2 = (0, 3, 0, 0)$, $v_3 = (0, 0, 7, 6)$, $v_4 = (0, 0, 4, 5)$

In Problems 9–11, find a basis for the indicated subspace of $\mathbb{R}^3$.

  9. The plane with equation $x - 2y + 5z = 0$.

  10. The plane with equation $y = z$.

  11. The line of intersection of the planes described in Problems 9 and 10.

In Problems 12–14, find a basis for the indicated subspace of $\mathbb{R}^4$.

  12. The set of all vectors of the form (a, b, c, d) for which $a = b + c + d$.

  13. The set of all vectors of the form (a, b, c, d) such that $a = 3c$ and $b = 4d$.

  14. The set of all vectors of the form (a, b, c, d) for which $a + 2b = c + 3d = 0$.

In Problems 15–26, find a basis for the solution space of the given homogeneous linear system.

  15. $x_1 - 2x_2 + 3x_3 = 0$
      $2x_1 - 3x_2 - x_3 = 0$

  16. $x_1 + 3x_2 + 4x_3 = 0$
      $3x_1 + 8x_2 + 7x_3 = 0$

  17. $x_1 - 3x_2 + 2x_3 - 4x_4 = 0$
      $2x_1 - 5x_2 + 7x_3 - 3x_4 = 0$

  18. $x_1 + 3x_2 + 4x_3 + 5x_4 = 0$
      $2x_1 + 6x_2 + 9x_3 + 5x_4 = 0$

  19. $x_1 - 3x_2 - 9x_3 - 5x_4 = 0$
      $2x_1 + x_2 - 4x_3 + 11x_4 = 0$
      $x_1 + 3x_2 + 3x_3 + 13x_4 = 0$

  20. $x_1 - 3x_2 - 10x_3 + 5x_4 = 0$
      $x_1 + 4x_2 + 11x_3 - 2x_4 = 0$
      $x_1 + 3x_2 + 8x_3 - x_4 = 0$

  21. $x_1 - 4x_2 - 3x_3 - 7x_4 = 0$
      $2x_1 - x_2 + x_3 + 7x_4 = 0$
      $x_1 + 2x_2 + 3x_3 + 11x_4 = 0$

  22. $x_1 - 2x_2 - 3x_3 - 16x_4 = 0$
      $2x_1 - 4x_2 + x_3 + 17x_4 = 0$
      $x_1 - 2x_2 + 3x_3 + 26x_4 = 0$

  23. $x_1 + 5x_2 + 13x_3 + 14x_4 = 0$
      $2x_1 + 5x_2 + 11x_3 + 12x_4 = 0$
      $2x_1 + 7x_2 + 17x_3 + 19x_4 = 0$

  24. $x_1 + 3x_2 - 4x_3 - 8x_4 + 6x_5 = 0$
      $x_1 + 2x_3 + x_4 + 3x_5 = 0$
      $2x_1 + 7x_2 - 10x_3 - 19x_4 + 13x_5 = 0$

  25. $x_1 + 2x_2 + 7x_3 - 9x_4 + 31x_5 = 0$
      $2x_1 + 4x_2 + 7x_3 - 11x_4 + 34x_5 = 0$
      $3x_1 + 6x_2 + 5x_3 - 11x_4 + 29x_5 = 0$

  26. $3x_1 + x_2 - 3x_3 + 11x_4 + 10x_5 = 0$
      $5x_1 + 8x_2 + 2x_3 - 2x_4 + 7x_5 = 0$
      $2x_1 + 5x_2 - x_4 + 14x_5 = 0$

Problems 27 through 36 further explore independent sets, spanning sets, and bases.

  27. Suppose that S is a set of n linearly independent vectors in the n-dimensional vector space V. Prove that S is a basis for V.

  28. Suppose that S is a set of n vectors that span the n-dimensional vector space V. Prove that S is a basis for V.

  29. Let $\{v_1, v_2, \ldots, v_k\}$ be a basis for the proper subspace W of the vector space V, and suppose that the vector v of V is not in W. Show that the vectors $v_1, v_2, \ldots, v_k, v$ are linearly independent.

  30. Use the result of Problem 29 to prove that every linearly independent set of vectors in a finite-dimensional vector space V is contained in a basis for V.

  31. Suppose that the vectors $v_1, v_2, \ldots, v_k, v_{k+1}$ span the vector space V and that $v_{k+1}$ is a linear combination of $v_1, v_2, \ldots, v_k$. Show that the vectors $v_1, v_2, \ldots, v_k$ span V.

  32. Use the result of Problem 31 to prove that every spanning set for a finite-dimensional vector space V contains a basis for V.

  33. Let S be a linearly independent set of vectors in the finite-dimensional vector space V. Then S is called a maximal linearly independent set provided that if any other vector is adjoined to S, then the resulting set is linearly dependent. Prove that every maximal linearly independent set in V is a basis for V.

  34. Let S be a finite set of vectors that span the vector space V. Then S is called a minimal spanning set provided that no proper subset of S spans V. Prove that every minimal spanning set in V is a basis for V.

  35. Let S be a finite set of vectors that span the vector space V. Then S is called a uniquely spanning set provided that each vector in V can be expressed in one and only one way as a linear combination of the vectors in S. Prove that every uniquely spanning set in V is a basis for V.

  36. Apply the definition of linear independence to show directly that the column vectors of the matrix in (12) are linearly independent.
