
Let \(V\) be a subspace of \(\mathbb{R}^n\) of dimension \(k\).

We say that a basis \(\{u_1,\ldots, u_k\}\) for \(V\) is an orthonormal basis if for each \(i = 1,\ldots,k\), \(u_i\) is a unit vector (i.e. \(\|u_i\| = 1\), or equivalently, \(u_i\cdot u_i = 1\)), and for all distinct \(i, j \in \{1,\ldots,k\}\), \(u_i\) and \(u_j\) are orthogonal (i.e. \(u_i\cdot u_j = 0\)).
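Equivalently, if \(U\) denotes the \(n\times k\) matrix whose columns are \(u_1,\ldots,u_k\), these two conditions say exactly that \(U^TU = I_k\), since the \((i,j)\) entry of \(U^TU\) is \(u_i\cdot u_j\). A minimal NumPy sketch of this check (the helper name `is_orthonormal` is ours, not a standard library function):

```python
import numpy as np

def is_orthonormal(vectors, tol=1e-10):
    """Check that each vector is a unit vector and that distinct
    vectors are orthogonal, i.e. that U^T U is the identity."""
    U = np.column_stack(vectors)          # n x k matrix with the u_i as columns
    return np.allclose(U.T @ U, np.eye(U.shape[1]), atol=tol)

e1, e2, e3 = np.eye(3)                    # standard basis of R^3
print(is_orthonormal([e1, e2, e3]))       # True
print(is_orthonormal([e1, e1 + e2]))      # False: e1 . (e1 + e2) = 1, and |e1 + e2| != 1
```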

Examples

\(\left\{ \begin{bmatrix} 1\\0\\0\end{bmatrix}, \begin{bmatrix} 0\\1\\0\end{bmatrix}, \begin{bmatrix} 0\\0\\1\end{bmatrix} \right\}\) is an orthonormal basis for \(\mathbb{R}^3\).

\(\left\{ \begin{bmatrix} \frac{2}{\sqrt{5}} \\ \frac{1}{\sqrt{5}} \\ 0 \\ 0\end{bmatrix}, \begin{bmatrix} 0 \\ 0 \\ \frac{1}{\sqrt{2}} \\ -\frac{1}{\sqrt{2}}\end{bmatrix}\right\}\) is an orthonormal basis for the nullspace of \(A = \begin{bmatrix} 1 & -2 & 0 & 0 \\ 0 & 0 & 1 & 1\end{bmatrix}\).
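One can confirm this claim numerically. Since \(A\) has rank 2, its nullspace is 2-dimensional, so two orthonormal (hence linearly independent) vectors in it automatically form a basis; a quick NumPy check:

```python
import numpy as np

A  = np.array([[1., -2., 0., 0.],
               [0.,  0., 1., 1.]])
u1 = np.array([2, 1, 0, 0]) / np.sqrt(5)
u2 = np.array([0, 0, 1, -1]) / np.sqrt(2)

# Both vectors lie in the nullspace of A ...
print(np.allclose(A @ u1, 0), np.allclose(A @ u2, 0))   # True True
# ... and they are orthogonal unit vectors.
print(np.isclose(u1 @ u1, 1), np.isclose(u2 @ u2, 1),
      np.isclose(u1 @ u2, 0))                           # True True True
```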

One benefit of having an orthonormal basis \(\{u_1,\ldots,u_k\}\) is that if \(\Gamma\) denotes the ordered basis \((u_1,\ldots,u_k)\), then for \(v \in V\), \([v]_\Gamma\) is given by \(\begin{bmatrix} u_1\cdot v \\ u_2 \cdot v \\ \vdots \\ u_k \cdot v\end{bmatrix}\).

For example, \(v = \begin{bmatrix} 2 \\ 1 \\ -1 \\ 1\end{bmatrix}\) is in the nullspace of \(A\) above. Letting \(\Gamma = (u_1,u_2)\) where \(u_1 = \begin{bmatrix} \frac{2}{\sqrt{5}} \\ \frac{1}{\sqrt{5}} \\ 0 \\ 0\end{bmatrix}\) and \(u_2 = \begin{bmatrix} 0 \\ 0 \\ \frac{1}{\sqrt{2}} \\ -\frac{1}{\sqrt{2}}\end{bmatrix}\), we obtain \[[v]_\Gamma = \begin{bmatrix} \frac{2}{\sqrt{5}}(2) + \frac{1}{\sqrt{5}}(1) + 0(-1) + 0(1) \\ 0(2) + 0(1) + \frac{1}{\sqrt{2}}(-1) + \left(-\frac{1}{\sqrt{2}}\right)(1) \end{bmatrix} = \begin{bmatrix} \sqrt{5} \\ -\sqrt{2} \end{bmatrix}.\] One can easily check that \(\sqrt{5}\, u_1 - \sqrt{2}\, u_2 = v\).
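The same computation in NumPy, reusing \(u_1\), \(u_2\), and \(v\) from above:

```python
import numpy as np

u1 = np.array([2, 1, 0, 0]) / np.sqrt(5)
u2 = np.array([0, 0, 1, -1]) / np.sqrt(2)
v  = np.array([2., 1., -1., 1.])

# With an orthonormal basis, coordinates are just dot products.
coords = np.array([u1 @ v, u2 @ v])
print(coords)                             # [ 2.236... -1.414...] = [sqrt(5), -sqrt(2)]
print(np.allclose(coords[0] * u1 + coords[1] * u2, v))   # True: sqrt(5) u1 - sqrt(2) u2 = v
```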

To construct an orthonormal basis for a subspace of \(\mathbb{R}^n\), one can use the Gram-Schmidt orthonormalization process. The details of this process can be found here.
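For illustration, here is a minimal NumPy sketch of the classical Gram-Schmidt process (the function name `gram_schmidt` is ours; see the linked page for the full details):

```python
import numpy as np

def gram_schmidt(vectors, tol=1e-12):
    """Classical Gram-Schmidt: produce an orthonormal basis for the
    span of the given vectors."""
    basis = []
    for v in vectors:
        w = np.array(v, dtype=float)
        for u in basis:
            w -= (u @ w) * u        # remove the component of v along u
        norm = np.linalg.norm(w)
        if norm > tol:              # skip vectors already in the span so far
            basis.append(w / norm)  # normalize to a unit vector
    return basis

# Orthonormalizing two non-orthogonal vectors that span the nullspace of A
# above recovers exactly the basis {u1, u2} from the earlier example:
for u in gram_schmidt([np.array([2., 1., 0., 0.]),
                       np.array([2., 1., 1., -1.])]):
    print(u)
```

In floating-point arithmetic, the classical process can lose orthogonality; in practice one often obtains an orthonormal basis from a QR factorization (`np.linalg.qr`) instead.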

To see why \([v]_\Gamma = \begin{bmatrix} u_1\cdot v \\ u_2 \cdot v \\ \vdots \\ u_k \cdot v \end{bmatrix}\), let \(\lambda_1,\ldots,\lambda_k\) be scalars such that \(v = \lambda_1 u_1 + \cdots + \lambda_k u_k.\)

Then for each \(i = 1,\ldots, k\), \begin{eqnarray*} u_i \cdot v & = & u_i \cdot (\lambda_1 u_1 + \cdots + \lambda_k u_k) \\ & = & \lambda_1\, u_i \cdot u_1 + \cdots + \lambda_k\, u_i \cdot u_k \\ & = & \lambda_i\, u_i \cdot u_i \qquad \text{(since } u_i \cdot u_j = 0 \text{ for } j \neq i\text{)} \\ & = & \lambda_i \|u_i\|^2 \\ & = & \lambda_i, \end{eqnarray*} so the \(i\)th coordinate of \(v\) with respect to \(\Gamma\) is \(\lambda_i = u_i \cdot v\), as claimed.
