Comprehensive Guide on Column Space in Linear Algebra
Column space
Let $\boldsymbol{A}$ be any $m\times{n}$ matrix:
The column space or range of an $m\times{n}$ matrix $\boldsymbol{A}$, denoted by $\mathrm{col}(\boldsymbol{A})$, is the span of its column vectors, that is:
Note that span is defined as the set of all the vectors that can be constructed using a linear combination of $\boldsymbol{a}_1$, $\boldsymbol{a}_2$, $\cdots$, $\boldsymbol{a}_n$, that is:
Finding the column space of a matrix (1)
Consider the following matrix:
Find the column space of $\boldsymbol{A}$.
Solution. By definitionlink, the column space of a matrix is the span of its column vectors. The column space of $\boldsymbol{A}$ is:
Notice how the two vectors are linearly independent. By theoremlink, we know that two linearly independent vectors in $\mathbb{R}^2$ span $\mathbb{R}^2$, which means that the column space of $\boldsymbol{A}$ is the entire $\mathbb{R}^2$.
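This kind of check is easy to carry out computationally. As a sketch (the matrix below is a hypothetical stand-in, not the matrix from this example), two columns of a $2\times2$ matrix are linearly independent exactly when the determinant is nonzero, and SymPy's `columnspace()` returns a basis for the column space:

```python
import sympy as sp

# Hypothetical stand-in for the matrix A in this example
# (the original matrix is not reproduced here).
A = sp.Matrix([[1, 2],
               [3, 5]])

# Two vectors in R^2 are linearly independent iff det(A) != 0.
print(A.det())        # nonzero, so the columns are linearly independent

# columnspace() returns a basis for col(A);
# two basis vectors in R^2 means col(A) is all of R^2.
basis = A.columnspace()
print(len(basis))
```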
Finding the column space of a matrix (2)
Consider the following matrix:
Find the column space of $\boldsymbol{A}$.
Solution. The column space of $\boldsymbol{A}$ is the span of its column vectors:
By the plus-minus theoremlink, since the column vectors of $\boldsymbol{A}$ are linearly dependent, we can remove one vector and still preserve the same span:
This means that the column space of $\boldsymbol{A}$ is the line traced out by the first column vector. Visually, the column space of $\boldsymbol{A}$ looks as follows:
For instance, the following vector is a particular element of the column space of $\boldsymbol{A}$:
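We can verify this behavior computationally as well. In the hypothetical matrix below (not the matrix from this example), the second column is twice the first, so the columns are linearly dependent and SymPy's basis for the column space contains a single vector:

```python
import sympy as sp

# Hypothetical matrix with linearly dependent columns:
# the second column is twice the first.
A = sp.Matrix([[1, 2],
               [2, 4]])

# A single basis vector means col(A) is a line in R^2.
basis = A.columnspace()
print(basis)
```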
The column space of a matrix is a subspace
The column space of an $m\times{n}$ matrix is a subspace of $\mathbb{R}^m$.
Proof. The column space of an $m\times{n}$ matrix $\boldsymbol{A}$ is defined as the span of its column vectors. Therefore, the column space is a subset of $\mathbb{R}^m$. To show that the column space is a subspace, we must show that it contains the zero vector and is closed under addition and scalar multiplication. Since $\boldsymbol{A}\boldsymbol{0}=\boldsymbol{0}$, the zero vector is contained in the column space.
Let $\boldsymbol{v}$ and $\boldsymbol{w}$ be any two elements in the column space of $\boldsymbol{A}$. This means that there exist some vectors $\boldsymbol{x}$ and $\boldsymbol{y}$ such that:
Now, let's check if vector $\boldsymbol{v}+\boldsymbol{w}$ is contained in the column space of $\boldsymbol{A}$ like so:
Since $\boldsymbol{A}(\boldsymbol{x}+\boldsymbol{y})$ results in $\boldsymbol{v}+\boldsymbol{w}$, we conclude that $\boldsymbol{v}+\boldsymbol{w}$ is contained in the column space of $\boldsymbol{A}$. This means that the column space of $\boldsymbol{A}$ is closed under addition.
Next, suppose vector $\boldsymbol{v}$ is an element of the column space of $\boldsymbol{A}$. This means that there exists some vector $\boldsymbol{x}$ such that $\boldsymbol{Ax}=\boldsymbol{v}$. Now, consider a scalar $k$. Let's check if $k\boldsymbol{v}$ is included in the column space of $\boldsymbol{A}$ like so:
This means that the column space of $\boldsymbol{A}$ is closed under scalar multiplication. Therefore, by definition, the column space of $\boldsymbol{A}$ is a subspace of $\mathbb{R}^m$. This completes the proof.
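The two closure checks in the proof can be illustrated numerically. The matrix $\boldsymbol{A}$, the vectors $\boldsymbol{x}$, $\boldsymbol{y}$, and the scalar $k$ below are arbitrary placeholders chosen for illustration:

```python
import sympy as sp

# Arbitrary 3x2 matrix and coefficient vectors (placeholders for illustration).
A = sp.Matrix([[1, 0],
               [2, 1],
               [0, 3]])
x = sp.Matrix([1, -1])
y = sp.Matrix([2, 5])
k = 7

v, w = A * x, A * y

# Closure under addition: A(x + y) = Ax + Ay = v + w.
assert A * (x + y) == v + w

# Closure under scalar multiplication: A(kx) = k(Ax) = kv.
assert A * (k * x) == k * v
print("closure checks passed")
```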
Relationship between consistent linear system and column space
A system of linear equations $\boldsymbol{Ax}=\boldsymbol{b}$ is consistentlink if and only if $\boldsymbol{b}$ is contained in the column space of $\boldsymbol{A}$.
Proof. By definitionlink, the column space of an $m\times{n}$ matrix $\boldsymbol{A}$ is the span of its column vectors, that is, the set of all linear combinations of the column vectors:
Also, by definitionlink, if the system of linear equations $\boldsymbol{Ax}=\boldsymbol{b}$ is consistent, then there exists at least one solution $\boldsymbol{x}$ such that the product $\boldsymbol{Ax}$ yields $\boldsymbol{b}$. We know from theoremlink that $\boldsymbol{Ax}$ can be written as a linear combination of the column vectors of $\boldsymbol{A}$, that is:
Again, given that $\boldsymbol{Ax}=\boldsymbol{b}$ is consistent, we know that there exist scalars $x_1$, $x_2$, $\cdots$, $x_n$ that make the above equality hold. Since $\boldsymbol{b}$ can be expressed as a linear combination of the column vectors of $\boldsymbol{A}$, we have that $\boldsymbol{b}$ belongs to the column space of $\boldsymbol{A}$.
Let's now prove the converse, that is, if $\boldsymbol{b}$ is contained in the column space of $\boldsymbol{A}$, then the system $\boldsymbol{Ax}=\boldsymbol{b}$ is consistent. By definitionlink, since $\boldsymbol{b}$ belongs to the column space of $\boldsymbol{A}$, we know that $\boldsymbol{b}$ can be expressed as a linear combination of the column vectors of $\boldsymbol{A}$, that is:
Using the same logic as \eqref{eq:ZIFT38Fn4iG3GO9QloM}, this can be converted into the linear system $\boldsymbol{Ax}=\boldsymbol{b}$. Because we know there exist $x_1$, $x_2$, $\cdots$, $x_n$ that make \eqref{eq:C4O1JExiiANzJPJw14a} hold, we conclude that $\boldsymbol{Ax}=\boldsymbol{b}$ is consistent. This completes the proof.
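This equivalence is easy to observe computationally. In the hypothetical example below (the matrices are not taken from the text), $\mathrm{col}(\boldsymbol{A})$ is the line spanned by $(1,2)$; a right-hand side on that line yields a consistent system, while one off the line yields no solution:

```python
import sympy as sp

# Hypothetical matrix whose columns are dependent: col(A) is the line
# spanned by (1, 2).
A = sp.Matrix([[1, 2],
               [2, 4]])
b1 = sp.Matrix([3, 6])   # lies on the line spanned by (1, 2)
b2 = sp.Matrix([1, 0])   # does not lie on that line

x1, x2 = sp.symbols('x1 x2')

# Nonempty solution set <=> b is in col(A) <=> Ax = b is consistent.
print(sp.linsolve((A, b1), [x1, x2]))

# Empty solution set <=> b is not in col(A) <=> Ax = b is inconsistent.
print(sp.linsolve((A, b2), [x1, x2]))
```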
Pivot columns of a matrix span the column space of the matrix
The columns in matrix $\boldsymbol{A}$ corresponding to the pivot columnslink in the reduced row echelon form of $\boldsymbol{A}$ span the column space of $\boldsymbol{A}$.
Proof. Suppose $\boldsymbol{A}$ is an $m\times{n}$ matrix. The column space of $\boldsymbol{A}$ is defined as the span of its column vectors:
We know from the plus-minus theoremlink that we can remove linearly dependent vectors from the spanning set while preserving the span. From theoremlink, we know that columns in $\boldsymbol{A}$ corresponding to the non-pivot columns can be expressed as a linear combination of the columns in $\boldsymbol{A}$ corresponding to pivot columns. In other words, non-pivot columns are linearly dependent on pivot columns.
Therefore, we can remove all columns in $\boldsymbol{A}$ corresponding to the non-pivot columns from the spanning set while preserving the span. This means that only the columns in $\boldsymbol{A}$ corresponding to the pivot columns are sufficient to span the column space of $\boldsymbol{A}$.
This completes the proof.
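As a sketch of this procedure (the matrix below is hypothetical, with its third column equal to the sum of the first two), SymPy's `rref()` returns both the reduced row echelon form and the indices of the pivot columns, and the corresponding columns of $\boldsymbol{A}$ span the column space:

```python
import sympy as sp

# Hypothetical matrix: the third column is the sum of the first two,
# so it is linearly dependent on them.
A = sp.Matrix([[1, 0, 1],
               [0, 1, 1],
               [2, 3, 5]])

# rref() returns the reduced row echelon form and the pivot column indices.
rrefA, pivots = A.rref()
print(pivots)

# The columns of A (not of rref(A)) at the pivot positions span col(A).
pivot_cols = [A.col(i) for i in pivots]
print(pivot_cols)
```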
Pivot columns of a matrix form a basis for the column space of the matrix
The columns in matrix $\boldsymbol{A}$ corresponding to the pivot columns of the reduced row echelon formlink of $\boldsymbol{A}$ form a basislink for the column space of $\boldsymbol{A}$.
Proof. From theoremlink, we know that the columns in $\boldsymbol{A}$ corresponding to the pivot columns in $\mathrm{rref}(\boldsymbol{A})$ span the column space of $\boldsymbol{A}$. From theoremlink, we know that the columns in $\boldsymbol{A}$ corresponding to the pivot columns in $\mathrm{rref}(\boldsymbol{A})$ are linearly independent. Therefore, by definitionlink of basis, the columns in $\boldsymbol{A}$ corresponding to the pivot columns in $\mathrm{rref}(\boldsymbol{A})$ form a basis for the column space of $\boldsymbol{A}$. This completes the proof.
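This is, in fact, how SymPy's `columnspace()` computes a basis: it row-reduces the matrix and collects the columns of $\boldsymbol{A}$ at the pivot positions. A sketch with a hypothetical matrix (the second column below is twice the first):

```python
import sympy as sp

# Hypothetical matrix: the second column is twice the first.
A = sp.Matrix([[1, 2, 0],
               [2, 4, 1],
               [3, 6, 2]])

# Collect the columns of A at the pivot positions of rref(A).
_, pivots = A.rref()
basis = [A.col(i) for i in pivots]

# SymPy's columnspace() returns exactly this basis.
assert basis == A.columnspace()

# The number of basis vectors is the dimension of col(A), i.e. rank(A).
print(len(basis))
```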