Could you clarify what constitutes a spanning set and a basis? Also, how does one test whether a set of vectors is a spanning set, and whether it is a basis?
SOLUTION
Let S = {v1, v2, ..., vn} be a set of vectors defined over a certain field (say, the real numbers).
We can use these vectors to build new ones by linearly adding them together:
w = a1 v1 + a2 v2 + ... + an vn, where the ai are scalars.
The set of all such vectors forms a vector space W,
and the set S is called a "spanning set" of W.
For example, look at the following set in R^3:
S = {v1 = (1, 0, 0), v2 = (0, 1, 0), v3 = (1, 1, 0)}
Then all the vectors that can be created from these three vectors are of the form:
w = a1 v1 + a2 v2 + a3 v3 = (a1 + a3, a2 + a3, 0)
If we look at these vectors as Cartesian coordinates in real space, they all lie in the xy plane (their third component is zero).
Not only that, any vector in the xy plane can be written as a linear combination of these vectors. Let w = (w1, w2, 0) be an arbitrary vector in the xy plane. Matching components gives a1 + a3 = w1 and a2 + a3 = w2, that is:
a1 = w1 - a3,  a2 = w2 - a3   (1.5)
So once we choose an arbitrary value for a3, equation (1.5) tells us what the values of the other two coefficients should be to "manufacture" the vector w.
Hence the set S is a spanning set of W.
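As a quick numerical sketch of this recipe (not part of the original solution; it assumes, for concreteness, the spanning vectors v1 = (1,0,0), v2 = (0,1,0), v3 = (1,1,0) and uses NumPy):

```python
import numpy as np

# The columns of A are the assumed spanning vectors v1, v2, v3.
A = np.array([[1, 0, 1],
              [0, 1, 1],
              [0, 0, 0]])

w = np.array([3.0, -2.0, 0.0])   # an arbitrary vector in the xy plane

a3 = 5.0                          # choose a3 freely...
a1, a2 = w[0] - a3, w[1] - a3     # ...then equation (1.5) fixes a1 and a2
coeffs = np.array([a1, a2, a3])

# The linear combination reproduces w exactly.
print(np.allclose(A @ coeffs, w))  # True
```

Any other choice of a3 works just as well, which already hints that the expansion in a spanning set need not be unique.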
Obviously, we can have many spanning sets for the same vector space.
For example, the set
S' = {u1 = (1, 1, 0), u2 = (1, -1, 0)}   (1.6)
is also a spanning set of W (the vectors in the xy plane), since
(w1, w2, 0) = ((w1 + w2)/2) u1 + ((w1 - w2)/2) u2
Again we see that we can write any vector in the xy plane as a linear combination of the vectors of equation (1.6) in this spanning set.
If we want to express the vector u1 = (1, 1, 0) in terms of S = {v1, v2, v3} we get:
u1 = v1 + v2
We simply chose a3 = 0 and used equation (1.5).
By the same token, we could have chosen a3 = 1, and then we get:
u1 = v3
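This non-uniqueness is easy to verify numerically (an illustrative check, assuming the example vectors v1 = (1,0,0), v2 = (0,1,0), v3 = (1,1,0)):

```python
import numpy as np

v1, v2, v3 = np.array([1, 0, 0]), np.array([0, 1, 0]), np.array([1, 1, 0])

# Two different coefficient choices for the same target vector (1, 1, 0):
combo_a = 1 * v1 + 1 * v2 + 0 * v3   # a3 = 0  ->  a1 = 1, a2 = 1
combo_b = 0 * v1 + 0 * v2 + 1 * v3   # a3 = 1  ->  a1 = 0, a2 = 0

print(np.array_equal(combo_a, combo_b))  # True: same vector, different expansions
```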
"Spanning set" is a very broad term: the same vector space can have an infinite number of spanning sets.
This is easy to see.
If S = {v1, ..., vn} and W = span(S), then by definition any vector w in W can be written as w = a1 v1 + ... + an vn.
Now take any linear combination u = b1 v1 + ... + bn vn and add it to S. We get a new spanning set S' = {v1, ..., vn, u}, and S' spans the same space W,
because for any other vector w we have:
w = a1 v1 + ... + an vn + 0·u
So we see that the two spanning sets are equivalent. We can repeat this process ad infinitum, so we can have an infinite number of spanning sets that span the same vector space.
We see that if we add a linear combination of the spanning vectors to the spanning set, we get a new spanning set that spans the same vector space.
However, this is just a trivial trick that gives us no additional information about the vector space.
We would like to know which spanning sets hold the minimal number of vectors that still span the vector space.
These minimal spanning sets are called "bases".
We say that a set B = {v1, ..., vn} is a basis of the vector space W if:
span(B) = W
and v1, ..., vn are linearly independent.
How do we know that a set of vectors forms an independent set?
An independent set is defined by the condition:
a1 v1 + a2 v2 + ... + an vn = 0  if and only if  a1 = a2 = ... = an = 0
That is, a linear combination of the vectors can be zero if and only if all the scalar coefficients are identically zero.
Conversely, the set is linearly dependent if there exists at least one non-zero scalar coefficient for which the linear sum of the vectors is zero.
This means that a basis set cannot include the zero vector (since 1·0 = 0 is already a vanishing combination with a non-zero coefficient).
For example, let us look again at the spanning set:
S = {v1 = (1, 0, 0), v2 = (0, 1, 0), v3 = (1, 1, 0)}
If we want to see whether it is linearly independent, we write:
a1 v1 + a2 v2 + a3 v3 = 0
A possible solution is:
a1 = 1, a2 = 1, a3 = -1, since (1, 0, 0) + (0, 1, 0) - (1, 1, 0) = (0, 0, 0)
So we see that this spanning set is linearly dependent, since we can find non-zero scalar coefficients for which a linear combination of the set vectors results in the zero vector.
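We can confirm this numerically (an illustrative check, assuming the example vectors v1 = (1,0,0), v2 = (0,1,0), v3 = (1,1,0)): three vectors but only rank 2 means the set is dependent, and the coefficients a1 = 1, a2 = 1, a3 = -1 indeed produce the zero vector.

```python
import numpy as np

v1, v2, v3 = np.array([1, 0, 0]), np.array([0, 1, 0]), np.array([1, 1, 0])
M = np.column_stack([v1, v2, v3])

print(np.linalg.matrix_rank(M))                            # 2 (three vectors, rank 2)
print(np.array_equal(1*v1 + 1*v2 - 1*v3, [0, 0, 0]))       # True
```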
Furthermore, we see that
v3 = v1 + v2
So one of the vectors in the spanning set is a linear combination of the other two.
Hence the set {v1, v2} spans the same vector space as {v1, v2, v3}.
Now, are v1 and v2 linearly dependent?
The answer is no: a1 v1 + a2 v2 = (a1, a2, 0) = 0 forces a1 = a2 = 0, so these two vectors are linearly independent. As such, they form a basis for all the vectors in the xy plane.
By the same token we can see that:
v1 = v3 - v2
So we can throw v1 out of the spanning set, and we are left with the basis {v2, v3}, which we can easily show is, indeed, linearly independent.
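Both candidate bases can be checked at once (a sketch, again assuming the example vectors v1 = (1,0,0), v2 = (0,1,0), v3 = (1,1,0)): each pair has full rank 2, so each pair is linearly independent and is a basis for the xy plane.

```python
import numpy as np

v1, v2, v3 = np.array([1.0, 0, 0]), np.array([0.0, 1, 0]), np.array([1.0, 1, 0])

# Rank of each candidate pair, stacked as matrix columns.
ranks = [int(np.linalg.matrix_rank(np.column_stack(pair)))
         for pair in ([v1, v2], [v2, v3])]
print(ranks)  # [2, 2]: both pairs are linearly independent
```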
So we see that a vector space can have more than one basis. However, unlike with the more general spanning sets, once a basis has been chosen for the vector space, the expansion of any vector in that space in terms of the basis vectors is unique.
To see this, we assume that the above statement is not true: that there is a vector w and a basis {v1, ..., vn} such that
w = a1 v1 + ... + an vn = b1 v1 + ... + bn vn
where ai ≠ bi for at least one term. Subtracting the two expansions gives:
(a1 - b1) v1 + ... + (an - bn) vn = 0   (1.20)
But since {v1, ..., vn} is a basis, all its vectors are linearly independent, and therefore the only way (1.20) can be true is if for every i we have ai = bi, which contradicts our initial assumption that there is at least one term where ai ≠ bi.
The number of vectors that constitute a basis equals the dimension of the vector space; in fact, the dimension of the vector space is defined as the number of vectors in its basis set. In our example, the dimension of the vector space that includes all the vectors in the xy plane is 2, and we need exactly two vectors in the basis (it is clear that one vector is not enough).
How do we know that a set of vectors is linearly independent?
The idea is to write a general linear combination of the vectors, equate it to zero, and solve for the coefficients. If the system admits a solution in which at least one of the coefficients is non-zero, the set is dependent. If all of the coefficients must be zero for the equality to hold, the set is linearly independent.
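This test is equivalent to a rank computation, which we can sketch as a small helper (the function name is ours, and the vectors are the assumed example vectors): the homogeneous system M a = 0 has only the trivial solution exactly when the matrix whose columns are the vectors has full column rank.

```python
import numpy as np

def is_linearly_independent(vectors):
    # Stack the vectors as matrix columns; full column rank means
    # the only solution of M a = 0 is a = 0, i.e. independence.
    M = np.column_stack(vectors)
    return np.linalg.matrix_rank(M) == len(vectors)

v1, v2, v3 = np.array([1.0, 0, 0]), np.array([0.0, 1, 0]), np.array([1.0, 1, 0])

print(is_linearly_independent([v1, v2, v3]))  # False: v3 = v1 + v2
print(is_linearly_independent([v1, v2]))      # True: a basis for the xy plane
```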