## Extending a linearly independent set

Let $$V$$ be a vector space. Suppose that $$v_1,\ldots, v_m$$ are linearly independent vectors in $$V$$. If $$u \in V$$ is not in the span of $$\{v_1,\ldots, v_m\}$$, then $$\{u, v_1,\ldots, v_m\}$$ is linearly independent.

To see this, suppose instead that $$\{u, v_1,\ldots, v_m\}$$ is linearly dependent. Then there exist scalars $$\mu, \lambda_1,\ldots, \lambda_m$$, not all equal to zero, such that $\mu u + \lambda_1 v_1 + \cdots + \lambda_m v_m = 0_V.$ Note that $$\mu$$ cannot be $$0$$, for otherwise we would have a linear combination of $$v_1,\ldots, v_m$$ with scalars not all equal to $$0$$ giving the zero vector, contradicting the linear independence of $$v_1,\ldots, v_m$$. Hence, we can rewrite the equation as $u = \frac{-\lambda_1}{\mu} v_1 + \cdots +\frac{-\lambda_m}{\mu} v_m.$ This contradicts the assumption that $$u$$ is not in the span of $$\{v_1, \ldots, v_m\}$$. Therefore, $$\{u, v_1,\ldots, v_m\}$$ must be linearly independent.
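The argument above can be checked numerically: appending a vector outside the span of a linearly independent set raises the rank of the matrix whose columns are those vectors by one. Below is a small sketch using NumPy's `matrix_rank`; the specific vectors are illustrative choices, not taken from the text.

```python
import numpy as np

# Two linearly independent vectors in R^3 (an illustrative choice).
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])

# u lies outside span{v1, v2} because its third coordinate is nonzero.
u = np.array([1.0, 2.0, 3.0])

rank_before = np.linalg.matrix_rank(np.column_stack([v1, v2]))
rank_after = np.linalg.matrix_rank(np.column_stack([v1, v2, u]))

# Adding a vector outside the span raises the rank by one,
# i.e. {u, v1, v2} is linearly independent.
print(rank_before, rank_after)  # 2 3
```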

The result above shows that one can obtain a basis for $$V$$ by starting with a linearly independent set of vectors and repeatedly adjoining a vector not in the span of the current set until the set spans $$V$$. When $$V$$ is finite-dimensional, this process must terminate, since each step produces a larger linearly independent set and no linearly independent set in $$V$$ can have more than $$\dim(V)$$ vectors. The resulting set is a basis for $$V$$ since it is linearly independent and spans $$V$$.
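The extension procedure just described can be sketched as a small greedy algorithm in $$\mathbb{R}^n$$: repeatedly test candidate vectors (here, the standard basis vectors, which always span $$\mathbb{R}^n$$) and keep those lying outside the current span. The function name `extend_to_basis` and the rank-based span test are illustrative choices, not from the text.

```python
import numpy as np

def extend_to_basis(vectors, dim):
    """Greedily extend a linearly independent list in R^dim to a basis,
    drawing candidate vectors from the standard basis."""
    basis = [np.asarray(v, dtype=float) for v in vectors]
    for i in range(dim):
        if len(basis) == dim:
            break
        e = np.zeros(dim)
        e[i] = 1.0
        # Keep e only if it lies outside the current span, i.e. if
        # appending it as a column increases the rank.
        if np.linalg.matrix_rank(np.column_stack(basis + [e])) > len(basis):
            basis.append(e)
    return basis

# Extend the two vectors from the example below to a basis of R^3.
basis = extend_to_basis([[1, 1, 0], [1, 0, -1]], 3)
print(len(basis))  # 3
```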

This also means that if $$B$$ is a linearly independent set of vectors with $$|B| = \dim(V)$$, then $$B$$ must span $$V$$: otherwise, $$B$$ could be extended to a basis for $$V$$ with more than $$\dim(V)$$ vectors, contradicting the fact that all bases for $$V$$ have the same number of vectors.

### Dimension of a subspace

Let $$W$$ be a subspace of a finite-dimensional vector space $$V$$. An immediate consequence of the above is that $$\dim(W) \leq \dim(V)$$. To see this, let $$w_1,\ldots,w_m$$ be a basis of $$W$$, where $$m = \dim(W)$$. As $$W$$ is a subspace of $$V$$, $$\{w_1,\ldots,w_m\}$$ is a linearly independent set in $$V$$ and its span, which is simply $$W$$, is contained in $$V$$. Extend this set to a basis $$\{w_1,\ldots,w_m,u_1,\ldots, u_k\}$$ of $$V$$. Then $m + k = \dim(V).$ As $$k \geq 0$$, we get $$m \leq \dim(V)$$, with strict inequality if and only if $$W$$ is a proper subspace of $$V$$.
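As a quick numerical illustration of $$\dim(W) \leq \dim(V)$$: the dimension of a subspace spanned by a list of vectors equals the rank of the matrix having those vectors as columns, which can never exceed the number of rows. The generating vectors below are a hypothetical example (the third column is deliberately a combination of the first two).

```python
import numpy as np

# W = span of three vectors in R^4 (hypothetical choice); the third
# column equals 2*(first column) + (second column).
W_gens = np.array([[1, 0, 2],
                   [0, 1, 1],
                   [1, 1, 3],
                   [0, 0, 0]], dtype=float)

dim_W = np.linalg.matrix_rank(W_gens)  # dimension of W = rank of generators
dim_V = 4                              # V = R^4

# W is a proper subspace here, so the inequality is strict.
print(dim_W, dim_V)  # 2 4
```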

### Example

Suppose that we are asked to extend $$U = \left\{\begin{bmatrix} 1\\1 \\0 \end{bmatrix}, \begin{bmatrix} 1\\0 \\-1 \end{bmatrix}\right\}$$ to a basis of $$\mathbb{R}^3$$. Since the dimension of $$\mathbb{R}^3$$ is three and $$U$$ already contains two linearly independent vectors, all we need to do is to find a vector in $$\mathbb{R}^3$$ that is not in the span of $$U$$. The question is, how do we find such a vector when there are infinitely many vectors in $$\mathbb{R}^3$$ to choose from?

We will do something clever by making the following observation: as $$\mathbb{R}^3$$ is spanned by $$\left\{\begin{bmatrix} 1\\0\\0\end{bmatrix}, \begin{bmatrix} 0\\1\\0\end{bmatrix}, \begin{bmatrix} 0\\0\\1\end{bmatrix}\right\}$$, at least one vector in this set cannot be written as a linear combination of vectors in $$U$$. Otherwise, the span of $$U$$ would contain $$\mathbb{R}^3$$, which is impossible because a set of two vectors spans a subspace of dimension at most two.

In other words, we are trying to solve $$Ax = b$$ where the columns of $$A$$ are the vectors in $$U$$ and $$x = \begin{bmatrix} x_1\\x_2\end{bmatrix}$$ for each choice of $$b$$ from $$\left\{\begin{bmatrix} 1\\0\\0\end{bmatrix}, \begin{bmatrix} 0\\1\\0\end{bmatrix}, \begin{bmatrix} 0\\0\\1\end{bmatrix}\right\}$$. Since all three systems have the same coefficient matrix, we can row reduce a single augmented matrix carrying all three right-hand sides at once: $\left[ \begin{array}{rr|rrr} 1 & 1 & 1 & 0 & 0\\ 1 & 0 & 0 & 1 & 0\\ 0 & -1 & 0 & 0 & 1 \end{array} \right].$ Interchanging rows 1 and 2, we obtain $\left[ \begin{array}{rr|rrr} 1 & 0 & 0 & 1 & 0\\ 1 & 1 & 1 & 0 & 0\\ 0 & -1 & 0 & 0 & 1 \end{array} \right].$ Performing $$R_2 \leftarrow R_2 - R_1$$ gives $\left[ \begin{array}{rr|rrr} 1 & 0 & 0 & 1 & 0\\ 0 & 1 & 1 & -1 & 0\\ 0 & -1 & 0 & 0 & 1 \end{array} \right].$ Performing $$R_3 \leftarrow R_3 + R_2$$ gives $\left[ \begin{array}{rr|rrr} 1 & 0 & 0 & 1 & 0\\ 0 & 1 & 1 & -1 & 0\\ 0 & 0 & 1 & -1 & 1 \end{array} \right].$ The bottom row of the left block is zero while every entry in the bottom row of the right block is nonzero, so all three systems are inconsistent: none of the three vectors can be written as a linear combination of vectors in $$U$$. That means we could add any one of them to $$U$$. The resulting set will have three linearly independent vectors and will therefore span $$\mathbb{R}^3$$.
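The conclusion of the row reduction can be double-checked with a rank computation: a vector $$b$$ lies in the span of $$U$$ exactly when appending $$b$$ as a column to $$A$$ leaves the rank unchanged. A short sketch using NumPy (the rows of `np.eye(3)` are the three standard basis vectors):

```python
import numpy as np

# Columns of A are the two vectors in U.
A = np.array([[1, 1],
              [1, 0],
              [0, -1]], dtype=float)

rank_A = np.linalg.matrix_rank(A)  # 2

# b is in the span of U exactly when appending it as a column
# leaves the rank unchanged.
in_span = [np.linalg.matrix_rank(np.column_stack([A, b])) == rank_A
           for b in np.eye(3)]
print(in_span)  # [False, False, False]
```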

## Exercises

1. Let $$P_2$$ denote the vector space of polynomials in $$x$$ with real coefficients having degree at most $$2$$. Extend $$\{x-1, x^2-x + 2\}$$ to a basis of $$P_2$$.

2. What is the largest possible dimension of a proper subspace of the vector space of $$2 \times 3$$ matrices with real entries?

3. Let $$B = \{v_1,\ldots, v_n\}$$ be a basis of $$\mathbb{R}^n$$. $$B$$ is said to be an orthogonal basis if the dot product of $$v_i$$ and $$v_j$$ is $$0$$ for all distinct $$i, j \in \{1,\ldots, n\}$$. For example, the basis $$\left\{\begin{bmatrix} 1\\0\\0\end{bmatrix}, \begin{bmatrix} 0\\1\\0\end{bmatrix}, \begin{bmatrix} 0\\0\\1\end{bmatrix}\right\}$$ is an orthogonal basis of $$\mathbb{R}^3$$.

    1. Extend $$\left\{\begin{bmatrix} 1\\-2\end{bmatrix}\right\}$$ to an orthogonal basis of $$\mathbb{R}^2$$.

    2. Extend $$\left\{\begin{bmatrix} 1\\0\\ -1 \end{bmatrix}, \begin{bmatrix} 1\\2\\ 1 \end{bmatrix}\right\}$$ to an orthogonal basis of $$\mathbb{R}^3$$.
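The orthogonality condition defined in exercise 3 can be verified mechanically via the Gram matrix $$M^\top M$$, whose off-diagonal entries are exactly the pairwise dot products. The helper below is an illustrative sketch (it checks, rather than constructs, an orthogonal basis), shown on the standard basis example from the text; the function name is an invented one.

```python
import numpy as np

def is_orthogonal_basis(vectors):
    """Check the definition from exercise 3: the vectors form a basis
    (square matrix of full rank) and all pairwise dot products vanish."""
    M = np.column_stack(vectors).astype(float)
    n = M.shape[0]
    if M.shape[1] != n or np.linalg.matrix_rank(M) != n:
        return False
    G = M.T @ M                          # Gram matrix of dot products
    off_diag = G - np.diag(np.diag(G))   # zero out the diagonal
    return np.allclose(off_diag, 0.0)

# The standard basis of R^3 from the text is an orthogonal basis.
print(is_orthogonal_basis([[1, 0, 0], [0, 1, 0], [0, 0, 1]]))  # True
```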