We call a set of vectors minimal if removing any one of the vectors from the set changes the span.
The set \(\left\{\begin{bmatrix} 1\\0\\1\end{bmatrix}, \begin{bmatrix} 1\\1\\0\end{bmatrix}\right\}\) is minimal: its span contains a vector with a nonzero second entry as well as a vector with a nonzero third entry, whereas the span of the first vector alone contains no vector with a nonzero second entry, and the span of the second vector alone contains no vector with a nonzero third entry. Hence, removing either vector changes the span.
The set \(\{v_1, v_2, v_3\}\), where \(v_1,v_2,v_3\) are vectors from some vector space over \(\mathbb{R}\) satisfying \(v_1 = 2v_2 - v_3\), is not minimal. In the span of \(\{v_1,v_2,v_3\}\), every vector can be written as \(a_1v_1 + a_2 v_2 + a_3 v_3\) for some scalars \(a_1, a_2, a_3\). Since \(v_1 = 2v_2 - v_3\), \(a_1v_1 + a_2 v_2 + a_3 v_3\) can also be written as \[a_1(2v_2 -v_3 ) + a_2 v_2 + a_3 v_3 = (2a_1 + a_2)v_2 + (a_3 - a_1) v_3,\] which is a linear combination of just \(v_2\) and \(v_3\). Hence, every vector in the span can be written as a linear combination of \(v_2\) and \(v_3\), so removing \(v_1\) from the set does not change the span.
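The argument above can also be verified mechanically. The sketch below uses sympy (our choice of tool, not part of the text) with one concrete, entirely hypothetical choice of \(v_2\) and \(v_3\) in \(\mathbb{R}^3\); since the span of \(\{v_2, v_3\}\) is always contained in the span of \(\{v_1, v_2, v_3\}\), equal ranks force the two spans to coincide.

```python
# Sanity check of the argument above, in exact arithmetic with sympy.
# The vectors v2 and v3 are hypothetical choices, not from the text;
# v1 is built to satisfy the dependence relation v1 = 2*v2 - v3.
from sympy import Matrix

v2 = Matrix([1, 0, 2])   # hypothetical choice
v3 = Matrix([0, 1, 1])   # hypothetical choice
v1 = 2*v2 - v3           # the relation from the example

# The columns of each matrix are the spanning vectors.
with_v1 = Matrix.hstack(v1, v2, v3)
without_v1 = Matrix.hstack(v2, v3)

# span{v2, v3} is contained in span{v1, v2, v3}, so equal ranks
# mean the spans are equal: removing v1 does not change the span.
print(with_v1.rank(), without_v1.rank())  # prints: 2 2
```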
Let \(V\) be a vector space. Vectors \(v_1,\ldots, v_k\) in \(V\) are said to be linearly independent if there does not exist \(i \in \{1,\ldots,k\}\) such that \(v_i\) can be written as a linear combination of the remaining vectors.
An equivalent way to define linear independence is as follows: \(v_1,\ldots, v_k\) in \(V\) are linearly independent if and only if the equation \(\lambda_1 v_1 + \cdots + \lambda_k v_k = 0_V\), where \(\lambda_1,\ldots,\lambda_k\) are scalars, has the unique solution \(\lambda_1=\cdots =\lambda_k = 0\). In other words, the only way to write the zero vector as a linear combination of \(v_1,\ldots,v_k\) is to set all the scalars to \(0\).
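For vectors in \(\mathbb{R}^n\) or \(\mathbb{Q}^n\), this characterization gives a mechanical test: place the vectors as the columns of a matrix \(A\), so that \(\lambda_1 v_1 + \cdots + \lambda_k v_k = 0\) becomes the homogeneous system \(A\lambda = 0\); the vectors are linearly independent exactly when this system has only the trivial solution. Here is a minimal sketch of that test using sympy (the tool and the helper name are our own, not part of the text):

```python
# Test linear independence by checking that A*lambda = 0 has only the
# trivial solution, i.e. that the nullspace of A is trivial.
# sympy uses exact rational arithmetic, so there are no rounding issues.
from sympy import Matrix

def is_linearly_independent(vectors):
    """True iff the only way to write the zero vector as a linear
    combination of the given vectors is with all scalars equal to 0."""
    A = Matrix.hstack(*[Matrix(v) for v in vectors])
    return A.nullspace() == []

# The two-vector set from the first example above:
print(is_linearly_independent([[1, 0, 1], [1, 1, 0]]))  # True
```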
If \(v_1,\ldots,v_k\) are linearly independent, \(\{v_1,\ldots,v_k\}\) is called a linearly independent set.
One can prove that a set of vectors is minimal if and only if it is linearly independent. The details are left as an exercise.
Observe that, by this definition, any set of vectors that contains the zero vector is not linearly independent. (Why?)
Linear independence is a fundamental notion in the study of vector spaces. As we will see, many key results cannot be stated without having first defined this notion.
Show that the vectors \(\begin{bmatrix} 1\\0\\1\end{bmatrix}\), \(\begin{bmatrix} 1\\1\\0\end{bmatrix}\), and \(\begin{bmatrix} 0\\0\\1\end{bmatrix}\) are linearly independent in \(\mathbb{R}^3\).
Consider the equation \[\lambda_1 \begin{bmatrix} 1\\0\\1\end{bmatrix}+ \lambda_2\begin{bmatrix} 1\\1\\0\end{bmatrix}+ \lambda_3\begin{bmatrix} 0\\0\\1\end{bmatrix}= \begin{bmatrix} 0\\0\\0\end{bmatrix}. \] Equivalently, the equation is \(\begin{bmatrix} \lambda_1+\lambda_2\\ \lambda_2\\ \lambda_1+\lambda_3\end{bmatrix}= \begin{bmatrix} 0\\0\\0\end{bmatrix}.\) Comparing the second entries, we must have \(\lambda_2 = 0\). Comparing the first entries then gives \(\lambda_1 = 0\), and comparing the third entries gives \(\lambda_3 = 0\). So \(\lambda_1 = \lambda_2 = \lambda_3 = 0\) is the only solution, and we conclude that the vectors are linearly independent.
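As a sanity check (again a sympy illustration, not part of the worked solution), the same conclusion follows because the matrix whose columns are the three vectors has a trivial null space:

```python
# The columns of A are the three vectors from the example.
from sympy import Matrix

A = Matrix([[1, 1, 0],
            [0, 1, 0],
            [1, 0, 1]])
print(A.nullspace())  # [] : lambda_1 = lambda_2 = lambda_3 = 0 is the only solution
```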
Show that \(x^2 - 1\) and \(x+1\) are linearly independent vectors in the vector space of polynomials in \(x\) with real coefficients having degree at most \(2\).
Let \(\lambda_1\) and \(\lambda_2\) be real numbers such that \(\lambda_1(x^2-1) + \lambda_2(x+1) = 0\). Rewriting the left-hand side, we obtain \(\lambda_1 x^2+\lambda_2 x + (-\lambda_1 + \lambda_2) = 0\). Since the right-hand side is the zero polynomial, the coefficients of \(x^2\) and \(x\) must each be \(0\), implying that \(\lambda_1 = 0\) and \(\lambda_2 = 0\). By definition, \(x^2 - 1\) and \(x+1\) are linearly independent.
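This check can also be done mechanically by identifying each polynomial \(ax^2 + bx + c\) with its coefficient vector \((a, b, c)\) in \(\mathbb{R}^3\); linear independence of the polynomials is then linear independence of these vectors, which reduces to the matrix test sketched earlier (again, sympy is our own illustrative choice):

```python
# Identify a*x^2 + b*x + c with the coefficient vector (a, b, c).
from sympy import Matrix

p1 = Matrix([1, 0, -1])  # x^2 - 1
p2 = Matrix([0, 1, 1])   # x + 1

# A trivial nullspace means the polynomials are linearly independent.
print(Matrix.hstack(p1, p2).nullspace())  # []
```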
Show that \(\left\{ \begin{bmatrix} 1 \\ 0 \\ -1\end{bmatrix}, \begin{bmatrix} 1 \\ 2 \\ 1\end{bmatrix}, \begin{bmatrix} 1 \\ -1 \\ 1\end{bmatrix} \right\}\) is a linearly independent set in \(\mathbb{Q}^3\).
Show that \(\left \{ \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}, \begin{bmatrix} 0 & 1 \\ -1 & 0 \end{bmatrix}, \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} \right \}\) is a linearly independent set in \(\mathbb{R}^{2\times 2}\).
Determine if \(\{ x - 1, x^2 + 2x, x^2 + 2\}\) is a linearly independent set in the vector space of polynomials in \(x\) with real coefficients having degree at most \(2\).
Let \(S=\{v_1,\ldots,v_k\}\) be a set of vectors from some vector space \(V\). Show that \(S\) is minimal if and only if it is linearly independent.
Let \(v_1,\ldots,v_k\) be linearly independent vectors in \(\mathbb{R}^n\) where \(k \lt n\). Let \(u \in \mathbb{R}^n\) be a nonzero vector such that \(u \cdot v_j = 0\) for all \(j = 1,\ldots, k\). Prove that \(u\) is not in the span of \(\{v_1,\ldots, v_k\}\). (Hint: Look at the case when \(k = 1\) to get some ideas. Also, look at the cases when \(n = 2\) and \(n = 3\) geometrically.)