Are the vectors
\(\begin{bmatrix} 1\\0\\1\end{bmatrix}\),
\(\begin{bmatrix} 1\\-1\\1\end{bmatrix}\),
\(\begin{bmatrix} 1\\1\\1\end{bmatrix}\) linearly independent?
The answer is “No”: the vectors are linearly dependent, as the following computation shows.
We want to see if
\(\alpha = \beta=\gamma=0\) is the only solution to
\[\alpha \begin{bmatrix} 1\\0\\1\end{bmatrix}+
\beta\begin{bmatrix} 1\\-1\\1\end{bmatrix}+
\gamma\begin{bmatrix} 1\\1\\1\end{bmatrix} =
\begin{bmatrix} 0 \\ 0 \\ 0\end{bmatrix}.\]
By comparing the two sides, we obtain the system
\begin{eqnarray*}
\alpha + \beta + \gamma & = & 0\\
-\beta + \gamma & = & 0 \\
\alpha + \beta + \gamma & = & 0.
\end{eqnarray*}
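Equivalently, collecting the three vectors as the columns of a coefficient matrix, the same system can be written as a single matrix equation:
\[\begin{bmatrix} 1 & 1 & 1\\ 0 & -1 & 1\\ 1 & 1 & 1\end{bmatrix}
\begin{bmatrix} \alpha \\ \beta \\ \gamma \end{bmatrix} =
\begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}.\]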
We see that the third equation repeats the first, so it is superfluous.
The second equation gives \(\beta = \gamma\), and substituting into the first gives \(\alpha = -2\gamma\).
Setting \(\gamma = 1\), we obtain \(\beta = 1\) and \(\alpha = -2\).
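Since \(\gamma\) is a free variable, every solution is a scalar multiple of this one:
\[\begin{bmatrix} \alpha \\ \beta \\ \gamma \end{bmatrix} =
t \begin{bmatrix} -2 \\ 1 \\ 1 \end{bmatrix}\]
for any scalar \(t\), so nontrivial solutions exist.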
In other words,
\[-2 \begin{bmatrix} 1\\0\\1\end{bmatrix}
+\begin{bmatrix} 1\\-1\\1\end{bmatrix}+
\begin{bmatrix} 1\\1\\1\end{bmatrix} = \begin{bmatrix} 0 \\ 0\\ 0\end{bmatrix}.\]
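A quick check of each component confirms this: \(-2 + 1 + 1 = 0\), \(0 - 1 + 1 = 0\), and \(-2 + 1 + 1 = 0\).
Since a nontrivial linear combination of the vectors equals the zero vector, the vectors are linearly dependent.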