Constructing a Basis for a Vector Space
A set containing more or fewer than n vectors in a vector space with n basis vectors
Let $V$ be a vector space and let $\{\boldsymbol{v}_1,\boldsymbol{v}_2,\cdots,\boldsymbol{v}_n\}$ be a basis for $V$.
If a set has more than $n$ vectors, then the set is linearly dependent.
If a set has fewer than $n$ vectors, then the set does not span $V$.
Proof. We start by proving the first case. Suppose a set $W$ contains $m$ vectors in $V$ defined below:
$$W=\{\boldsymbol{w}_1,\boldsymbol{w}_2,\cdots,\boldsymbol{w}_m\}$$
where $m\gt{n}$. To show that $W$ is linearly dependent, we must show that there exists a non-zero solution to:
\begin{equation}\label{eq:YrWzjiXdbNK669rfWH4}
x_1\boldsymbol{w}_1+x_2\boldsymbol{w}_2+\cdots+x_m\boldsymbol{w}_m=\boldsymbol{0}
\end{equation}
Since $\{\boldsymbol{v}_1,\boldsymbol{v}_2,\cdots,\boldsymbol{v}_n\}$ is a basis for $V$, every vector in $V$ can be expressed as a linear combination of these basis vectors. Since the vectors in $W$ are in $V$, there must exist some scalars $a_{ij}$ such that:
\begin{align*}
\boldsymbol{w}_1&=a_{11}\boldsymbol{v}_1+a_{21}\boldsymbol{v}_2+\cdots+a_{n1}\boldsymbol{v}_n\\
\boldsymbol{w}_2&=a_{12}\boldsymbol{v}_1+a_{22}\boldsymbol{v}_2+\cdots+a_{n2}\boldsymbol{v}_n\\
&\;\;\vdots\\
\boldsymbol{w}_m&=a_{1m}\boldsymbol{v}_1+a_{2m}\boldsymbol{v}_2+\cdots+a_{nm}\boldsymbol{v}_n
\end{align*}
Substituting these expressions into \eqref{eq:YrWzjiXdbNK669rfWH4} and collecting the coefficient of each basis vector $\boldsymbol{v}_j$, we see that \eqref{eq:YrWzjiXdbNK669rfWH4} is satisfied whenever every one of these coefficients vanishes. Consider therefore the homogeneous system of linear equations:
\begin{align*}
a_{11}x_1+a_{12}x_2+\cdots+a_{1m}x_m&=0\\
a_{21}x_1+a_{22}x_2+\cdots+a_{2m}x_m&=0\\
&\;\;\vdots\\
a_{n1}x_1+a_{n2}x_2+\cdots+a_{nm}x_m&=0
\end{align*}
This system has $n$ equations and $m$ unknowns. We know from a previous theorem that because there are more unknowns than equations ($m\gt{n}$), the system has infinitely many solutions, and in particular a non-zero solution $x_1$, $x_2$, $\cdots$, $x_m$. Since any such solution is also a non-zero solution to \eqref{eq:YrWzjiXdbNK669rfWH4}, we conclude that $W$ is linearly dependent.
Let us now prove the second case. Suppose $W$ is a set of $m$ vectors in $V$ defined below:
$$W=\{\boldsymbol{w}_1,\boldsymbol{w}_2,\cdots,\boldsymbol{w}_m\}$$
where $m\lt{n}$. Let's prove that $W$ does not span $V$ by contradiction. Consider the linear combination of the $n$ basis vectors $\boldsymbol{v}_1$, $\boldsymbol{v}_2$, $\cdots$, $\boldsymbol{v}_n$ of $V$ below:
\begin{equation}\label{eq:L3YPAqf2SZDnsdBkiDR}
x_1\boldsymbol{v}_1+x_2\boldsymbol{v}_2+\cdots+x_n\boldsymbol{v}_n
\end{equation}
Note that we did not specify that $V$ is $n$-dimensional here - we're simply stating that there are $n$ basis vectors in $V$. We will later prove that if $V$ has $n$ basis vectors, then $V$ must be $n$-dimensional. For now though, you can assume that $V$ could be any dimension.
Now, assume that $W$ spans $V$, which means that every vector in $V$ - in particular each basis vector $\boldsymbol{v}_i$ - can be expressed as a linear combination of vectors in $W$, that is:
\begin{equation}\label{eq:mw2Y6ey9JpHDhideRi4}
\begin{aligned}
\boldsymbol{v}_1&=a_{11}\boldsymbol{w}_1+a_{21}\boldsymbol{w}_2+\cdots+a_{m1}\boldsymbol{w}_m\\
\boldsymbol{v}_2&=a_{12}\boldsymbol{w}_1+a_{22}\boldsymbol{w}_2+\cdots+a_{m2}\boldsymbol{w}_m\\
&\;\;\vdots\\
\boldsymbol{v}_n&=a_{1n}\boldsymbol{w}_1+a_{2n}\boldsymbol{w}_2+\cdots+a_{mn}\boldsymbol{w}_m
\end{aligned}
\end{equation}
Now, suppose we are solving the following homogeneous system of linear equations:
\begin{equation}\label{eq:uZdCZy35fVid8nJn2xh}
\begin{aligned}
a_{11}x_1+a_{12}x_2+\cdots+a_{1n}x_n&=0\\
a_{21}x_1+a_{22}x_2+\cdots+a_{2n}x_n&=0\\
&\;\;\vdots\\
a_{m1}x_1+a_{m2}x_2+\cdots+a_{mn}x_n&=0
\end{aligned}
\end{equation}
Since $m\lt{n}$ in \eqref{eq:uZdCZy35fVid8nJn2xh}, there are more columns than rows (or more unknowns than equations), which means by a previous theorem that the system \eqref{eq:uZdCZy35fVid8nJn2xh} has infinitely many solutions, and in particular a non-zero solution $x_1$, $x_2$, $\cdots$, $x_n$.
Now, let's substitute the vectors $\boldsymbol{v}_1$, $\boldsymbol{v}_2$, $\cdots$, $\boldsymbol{v}_n$ from \eqref{eq:mw2Y6ey9JpHDhideRi4} into \eqref{eq:L3YPAqf2SZDnsdBkiDR} to get:
\begin{equation}\label{eq:L0nggT3E0tYyhqM4HMW}
x_1(a_{11}\boldsymbol{w}_1+a_{21}\boldsymbol{w}_2+\cdots+a_{m1}\boldsymbol{w}_m)+
x_2(a_{12}\boldsymbol{w}_1+a_{22}\boldsymbol{w}_2+\cdots+a_{m2}\boldsymbol{w}_m)+\cdots+
x_n(a_{1n}\boldsymbol{w}_1+a_{2n}\boldsymbol{w}_2+\cdots+a_{mn}\boldsymbol{w}_m)
\end{equation}
We can reformulate \eqref{eq:L0nggT3E0tYyhqM4HMW} by grouping the terms by $\boldsymbol{w}_j$ instead:
\begin{equation}\label{eq:KIQKJlzx5XlMM9MP2LK}
(a_{11}x_1+a_{12}x_2+\cdots+a_{1n}x_n)\boldsymbol{w}_1+
(a_{21}x_1+a_{22}x_2+\cdots+a_{2n}x_n)\boldsymbol{w}_2+\cdots+
(a_{m1}x_1+a_{m2}x_2+\cdots+a_{mn}x_n)\boldsymbol{w}_m
\end{equation}
The sums in the brackets of \eqref{eq:KIQKJlzx5XlMM9MP2LK} are precisely the left-hand sides of \eqref{eq:uZdCZy35fVid8nJn2xh}, all of which equal zero, which means:
$$0\boldsymbol{w}_1+0\boldsymbol{w}_2+\cdots+0\boldsymbol{w}_m=\boldsymbol{0}$$
Let's summarize what we have derived:
$$x_1\boldsymbol{v}_1+x_2\boldsymbol{v}_2+\cdots+x_n\boldsymbol{v}_n=\boldsymbol{0}$$
We have already shown that there exists at least one non-zero solution $x_1$, $x_2$, $\cdots$, $x_n$. Therefore, by the definition of linear dependence, the vectors $\boldsymbol{v}_1$, $\boldsymbol{v}_2$, $\cdots$, $\boldsymbol{v}_n$ are linearly dependent. This contradicts the fact that $\boldsymbol{v}_1$, $\boldsymbol{v}_2$, $\cdots$, $\boldsymbol{v}_n$ are linearly independent, as they are basis vectors. Since we have a contradiction, our assumption that $W$ spans $V$ must be false. This proves that $W$ does not span $V$.
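Although the proof is abstract, we can observe the first case numerically. Below is a minimal sketch in Python using NumPy - the three vectors are arbitrary choices for illustration, not taken from the proof. We stack $m=3$ vectors of $\mathbb{R}^2$ (so $m\gt{n}=2$) as the columns of a matrix $A$ and recover a non-zero solution of $x_1\boldsymbol{w}_1+x_2\boldsymbol{w}_2+x_3\boldsymbol{w}_3=\boldsymbol{0}$ from its null space:

```python
import numpy as np

# Three arbitrarily chosen vectors in R^2, stacked as the columns of A.
# Since m = 3 > n = 2, the theorem guarantees a non-zero x with Ax = 0.
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 3.0]])

# The right-singular vectors beyond rank(A) span the null space of A;
# rank(A) <= 2 < 3, so at least one such vector exists.
_, _, Vt = np.linalg.svd(A)
x = Vt[-1]

print(x)      # non-zero coefficients, proportional to (2, 3, -1)
print(A @ x)  # approximately [0, 0]: a non-trivial dependence relation
```

Here $2\boldsymbol{w}_1+3\boldsymbol{w}_2-\boldsymbol{w}_3=\boldsymbol{0}$, so these three vectors are indeed linearly dependent.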
Number of bases for a finite-dimensional vector space
All bases for a finite-dimensional vector space have the same number of vectors.
Proof. Suppose $V$ is a vector space with a basis containing $n$ vectors. In the theorem above, we have shown that:
if a set has more than $n$ vectors, then the set is linearly dependent.
if a set has fewer than $n$ vectors, then the set does not span $V$.
By definition, basis vectors must:
be linearly independent.
span the vector space $V$.
For both of these properties to be satisfied, a basis must contain neither more nor fewer than $n$ vectors - that is, every basis for $V$ contains exactly $n$ vectors. This completes the proof.
Number of basis vectors of an n-dimensional vector space
The number of basis vectors in an $n$-dimensional vector space is exactly $n$. In particular, every basis of $\mathbb{R}^n$ contains exactly $n$ vectors.
Proof. We know from an earlier section that the standard basis for $\mathbb{R}^n$ is:
$$\boldsymbol{e}_1=\begin{pmatrix}1\\0\\\vdots\\0\end{pmatrix},\;\;
\boldsymbol{e}_2=\begin{pmatrix}0\\1\\\vdots\\0\end{pmatrix},\;\;\cdots,\;\;
\boldsymbol{e}_n=\begin{pmatrix}0\\0\\\vdots\\1\end{pmatrix}$$
There are $n$ vectors in this basis. From the previous theorem, we know that all bases for a finite-dimensional vector space have the same number of vectors. Since the standard basis for $\mathbb{R}^n$ has $n$ vectors, we conclude that every basis of $\mathbb{R}^n$ must also have exactly $n$ vectors.
Using this theorem, we can now add one slight detail to the earlier theorem - instead of an arbitrary vector space $V$, we now know that $V$ must be $n$-dimensional if $V$ has $n$ basis vectors.
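As a quick sanity check in code (a sketch assuming NumPy, with $n=4$ as an arbitrary choice), the columns of the $n\times n$ identity matrix are exactly the $n$ standard basis vectors of $\mathbb{R}^n$:

```python
import numpy as np

n = 4  # arbitrary illustrative choice of n

# The columns of the identity matrix are e_1, ..., e_n,
# the standard basis vectors of R^n.
I = np.eye(n)
print(I.shape[1])                # n columns, i.e. n basis vectors
print(np.linalg.matrix_rank(I))  # rank n: independent columns that span R^n
```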
Relationship between the dimension of a vector space, linear dependence, and span
Let $V$ be an $n$-dimensional vector space.
If a set has more than $n$ vectors, then the set is linearly dependent.
If a set has fewer than $n$ vectors, then the set does not span $V$.
Showing that a set of vectors is linearly dependent by inspection
Consider a set $S$ of three vectors in $\mathbb{R}^2$.
Is $S$ linearly dependent or independent?
Solution. These vectors reside in $\mathbb{R}^2$. Because the set contains more than $2$ vectors, we know by the theorem above that the set is linearly dependent. This also means that the set is not a basis for $\mathbb{R}^2$.
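We can confirm this kind of conclusion numerically. Below is a quick check in Python (assuming NumPy, with three arbitrarily chosen vectors): the rank of the matrix whose columns are the vectors is at most $2$, which is less than the number of vectors, so the vectors must be linearly dependent.

```python
import numpy as np

# Three arbitrarily chosen vectors in R^2 as the columns of S.
S = np.array([[1.0, 4.0, 2.0],
              [2.0, 5.0, 3.0]])

# The rank is at most 2 (the number of rows), but there are 3 columns,
# so the columns cannot all be linearly independent.
print(np.linalg.matrix_rank(S))               # 2
print(np.linalg.matrix_rank(S) < S.shape[1])  # True: linearly dependent
```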
Showing that a set of vectors does not span a vector space by inspection
Consider a set $S$ containing two vectors in $\mathbb{R}^3$.
Give a reason why $S$ does not span $\mathbb{R}^3$.
Solution. By the theorem above, we know that $2$ vectors cannot span $\mathbb{R}^3$. This also means that $S$ is not a basis for $\mathbb{R}^3$.
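Again, a short numerical check (assuming NumPy, with two arbitrarily chosen vectors) illustrates why: the span of two vectors in $\mathbb{R}^3$ has dimension at most $2$, so it cannot be all of $\mathbb{R}^3$.

```python
import numpy as np

# Two arbitrarily chosen vectors in R^3 as the columns of S.
S = np.array([[1.0, 0.0],
              [2.0, 1.0],
              [3.0, 1.0]])

# The column space has dimension rank(S) <= 2 < 3,
# so the two columns cannot span all of R^3.
print(np.linalg.matrix_rank(S))  # 2
```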
Dimension of a vector space
The dimension of a finite-dimensional vector space $V$, denoted by $\mathrm{dim}(V)$, is the number of vectors in a basis for $V$.
Finding the dimension of R^n
Find the dimension of $\mathbb{R}^2$ and $\mathbb{R}^n$.
Solution. Because $\mathbb{R}^2$ has $2$ basis vectors, its dimension is $2$, that is, $\dim(\mathbb{R}^2)=2$. Similarly, $\dim(\mathbb{R}^n)=n$.
Finding the dimension of a spanning set
Consider the following spanning set:
$$H=\mathrm{span}\{\boldsymbol{v}_1,\boldsymbol{v}_2\}$$
where $\boldsymbol{v}_1$ and $\boldsymbol{v}_2$ are two linearly independent vectors.
Find the dimension of $H$.
Solution. To find the dimension of $H$, we must first find a basis for $H$. The two vectors $\boldsymbol{v}_1$ and $\boldsymbol{v}_2$ are linearly independent, and $H$ is defined to be their span. By definition, then, the vectors form a basis for $H$. The basis is therefore:
$$\{\boldsymbol{v}_1,\boldsymbol{v}_2\}$$
The dimension is simply the number of vectors in our basis, which means $\text{dim}(H)=2$. This should make sense because all possible linear combinations of $\boldsymbol{v}_1$ and $\boldsymbol{v}_2$ will form a plane, which is a two-dimensional entity.
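In code, the dimension of a span can be computed as a matrix rank. Here is a minimal sketch (assuming NumPy, with a hypothetical choice of two linearly independent vectors in $\mathbb{R}^3$):

```python
import numpy as np

# Hypothetical choice of two linearly independent vectors in R^3.
v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])

# dim(H) for H = span{v1, v2} equals the rank of the matrix
# whose columns are v1 and v2.
H = np.column_stack([v1, v2])
print(np.linalg.matrix_rank(H))  # 2, so dim(H) = 2
```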
Plus/Minus theorem
Let $S$ be a non-empty set of vectors in vector space $V$.
Let $S$ span $V$. If we remove a vector from $S$ that is linearly dependent on the other vectors of $S$, then $S$ still spans $V$.
Let $S$ be linearly independent. If we insert a new vector that is outside the span of $S$ into $S$, then $S$ will still remain linearly independent.
Proof. We start by proving the first case. Let $S$ span $V$. Suppose there exists a vector $\boldsymbol{v}$ in $S$ that is linearly dependent on the other vectors of $S$, that is, $\boldsymbol{v}$ can be expressed as a linear combination of the other vectors in $S$. Removing $\boldsymbol{v}$ does not affect the span of $S$ because any linear combination involving $\boldsymbol{v}$ can be rewritten as a linear combination of the remaining vectors by substituting this expression for $\boldsymbol{v}$.
Next, we prove the second case. Let $S$ be linearly independent. Suppose there exists a vector $\boldsymbol{v}$ in $V$ such that $\boldsymbol{v}$ is outside the span of $S$. By the definition of span, this means that $\boldsymbol{v}$ cannot be constructed using a linear combination of the vectors in $S$. Now, in any dependence relation among the vectors of $S\cup\{\boldsymbol{v}\}$, the coefficient of $\boldsymbol{v}$ must be zero - otherwise we could solve for $\boldsymbol{v}$ as a linear combination of the vectors in $S$. But with the coefficient of $\boldsymbol{v}$ equal to zero, the relation only involves vectors of $S$, which are linearly independent, so all the remaining coefficients must be zero as well. Therefore, inserting $\boldsymbol{v}$ into $S$ keeps $S$ linearly independent.
This completes the proof.
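The plus/minus theorem is easy to watch in action. The following sketch (assuming NumPy, with arbitrarily chosen vectors) removes a dependent column without changing the span, then inserts a vector outside the span and checks that independence is preserved:

```python
import numpy as np

# Columns of A span a plane in R^3; the third column equals the sum
# of the first two, so it is linearly dependent on them.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 0.0]])

# Removing the dependent column leaves the span (hence the rank) unchanged.
print(np.linalg.matrix_rank(A))         # 2
print(np.linalg.matrix_rank(A[:, :2]))  # still 2

# Inserting a vector outside the span (here e_3) keeps the columns
# linearly independent: the rank grows to 3.
B = np.column_stack([A[:, :2], [0.0, 0.0, 1.0]])
print(np.linalg.matrix_rank(B))         # 3
```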
Constructing a basis for a vector space
Let $S$ be a finite set of vectors in a finite-dimensional vector space $V$.
If $S$ spans $V$ but is not a basis for $V$, then $S$ can be reduced to a basis for $V$.
If $S$ is a linearly independent set but is not a basis for $V$, then $S$ can be expanded to a basis for $V$.
Proof. Let's start by proving the first case. If $S$ spans $V$ but is not a basis for $V$, then by the definition of basis, $S$ must be linearly dependent, which means that at least one vector in $S$ can be expressed as a linear combination of the others. We remove this vector from $S$. From the plus/minus theorem, we know that the new set $S$ still spans $V$ because removing a linearly dependent vector from a spanning set does not affect the span. Since $S$ is finite, we can repeatedly remove linearly dependent vectors in this way until we are left with a linearly independent set that spans $V$ - at this point, $S$ is a basis for $V$ by definition.
Next, let's prove the second case. If $S$ is a linearly independent set with $k$ vectors $\boldsymbol{v}_1$, $\boldsymbol{v}_2$, $\cdots$, $\boldsymbol{v}_k$ but is not a basis for $V$, then $S$ does not span $V$ by the definition of basis. This means that there exists some $\boldsymbol{v}_{k+1}$ in $V$ that is not in $\mathrm{span}(S)$, that is, $\boldsymbol{v}_{k+1}$ cannot be constructed using the vectors in $S$. From the plus/minus theorem, we know that if we insert this vector $\boldsymbol{v}_{k+1}$ into $S$, then $\{\boldsymbol{v}_1,\boldsymbol{v}_2,\cdots,\boldsymbol{v}_{k},\boldsymbol{v}_{k+1}\}$ will still be linearly independent. We continue expanding $S$ in this way until $S$ spans $V$, thereby making $S$ a basis for $V$. Because $V$ is finite-dimensional, a linearly independent set in $V$ cannot grow beyond $\mathrm{dim}(V)$ vectors, so this process is guaranteed to terminate.
This completes the proof.
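The first case of this theorem translates directly into an algorithm: walk through the spanning set and keep only the vectors that increase the rank. Here is a minimal sketch in Python (assuming NumPy; `reduce_to_basis` and the four vectors are hypothetical names and values chosen for illustration):

```python
import numpy as np

def reduce_to_basis(vectors):
    """Keep each vector that increases the rank, i.e. discard every
    vector that is linearly dependent on the ones kept so far."""
    basis = []
    for v in vectors:
        candidate = basis + [v]
        if np.linalg.matrix_rank(np.column_stack(candidate)) == len(candidate):
            basis.append(v)
    return basis

# A spanning set for R^2 that is not a basis (4 vectors, only 2 needed).
vectors = [np.array([1.0, 0.0]),
           np.array([2.0, 0.0]),   # dependent on the first vector
           np.array([0.0, 1.0]),
           np.array([1.0, 1.0])]   # dependent on the vectors kept so far

basis = reduce_to_basis(vectors)
print(len(basis))  # 2: the reduced set is a basis for R^2
```

This mirrors the removal process in the proof: each discarded vector lies in the span of the kept ones, so the span never shrinks.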
Triangular relationship between basis vectors, span and linear dependence
Let $V$ be an $n$-dimensional vector space and let $S$ be a set of $n$ vectors in $V$. Then $S$ is a basis for $V$ if and only if either of the following holds:
$S$ spans $V$;
$S$ is linearly independent.
Proof. Let's prove the forward proposition that if $S$ is a basis consisting of $n$ vectors in an $n$-dimensional vector space, then $S$ spans $V$ and $S$ is linearly independent. There is actually nothing to prove here because by definition of basis, $S$ is a linearly independent set that also spans $V$.
Let's now prove the backward proposition that if $S$ is a set of $n$ vectors in an $n$-dimensional vector space $V$ and $S$ spans $V$, then $S$ is a basis for $V$. Since we are given that $S$ spans $V$, we need to show that $S$ is also a linearly independent set. Assume the contrary, that $S$ is linearly dependent. This means that there exists at least one vector that can be expressed as a linear combination of the other vectors in $S$. By the plus/minus theorem, we can remove this vector and the remaining $n-1$ vectors still span $V$. However, this cannot be true because, by the theorem above, a set with fewer than $n$ vectors does not span $V$. We have a contradiction, which means that our initial assumption that $S$ is linearly dependent is false. In other words, $S$ must be linearly independent. Because $S$ spans $V$ and is also linearly independent, $S$ must be a basis for $V$ by definition.
Next, let's prove that if $S$ is a set of $n$ linearly independent vectors in an $n$-dimensional vector space $V$, then $S$ is a basis for $V$. Since we are given that $S$ is linearly independent, we need to show that $S$ spans $V$. Assume the contrary once again, that $S$ does not span $V$. This means that there exists at least one vector in $V$ that resides outside the span of $S$, that is, at least one vector in $V$ cannot be expressed as a linear combination of vectors in $S$. By the plus/minus theorem, we can add this vector to $S$ while keeping $S$ linearly independent. We now have $n+1$ linearly independent vectors in an $n$-dimensional vector space $V$, which is impossible because, by the theorem above, a set with more than $n$ vectors must be linearly dependent. We have a contradiction, which means that our assumption that $S$ does not span $V$ is false. Because $S$ is a linearly independent set that spans $V$, we conclude that $S$ must be a basis for $V$ by definition.
This completes the proof.
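This theorem gives a one-step numerical test for a set of $n$ vectors in $\mathbb{R}^n$. Here is a minimal sketch (assuming NumPy, with three arbitrarily chosen vectors in $\mathbb{R}^3$): since rank $n$ is equivalent to both linear independence and spanning, a single rank computation decides whether the set is a basis.

```python
import numpy as np

# Three arbitrarily chosen vectors in R^3, stacked as the columns of S.
S = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 0.0]])

# For n vectors in R^n:
#   rank n  <=>  linearly independent  <=>  spans R^n  <=>  basis.
n = S.shape[0]
print(np.linalg.matrix_rank(S) == n)  # True: S is a basis for R^3
```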