Suppose that you are asked to come up with a few invertible \(3\times 3\) matrices whose matrix inverses are not obvious. What would you do?

Certainly, you could look up a few of them in a book or on the internet. You could also write down a few arbitrary \(3\times 3\) matrices and use the method for finding matrix inverses discussed in the previous segment to check whether they are invertible. However, neither approach seems very effective.

The key result that allows us to generate an arbitrary invertible matrix is the following:

A matrix \(A \in \mathbb{F}^{n\times n}\) where \(\mathbb{F}\) is a field and \(n\) is a positive integer is invertible if and only if \(A\) is a product of elementary matrices in \(\mathbb{F}^{n\times n}\).

For example, \(A = \begin{bmatrix} 1 & 2 \\ 3 & -1 \end{bmatrix}\) is invertible and can be written as the product \(\begin{bmatrix} 1&0\\3& 1\end{bmatrix} \begin{bmatrix} 1&0\\0& -7\end{bmatrix} \begin{bmatrix} 1&2\\0& 1\end{bmatrix}.\) The matrix \(\begin{bmatrix} 1&0\\3& 1\end{bmatrix}\) is the elementary matrix corresponding to the elementary row operation \(R_2 \leftarrow R_2 + 3R_1\). The matrix \(\begin{bmatrix} 1&0\\0& -7\end{bmatrix}\) is the elementary matrix corresponding to the elementary row operation \(R_2 \leftarrow -7R_2\). The matrix \(\begin{bmatrix} 1&2\\0& 1\end{bmatrix}\) is the elementary matrix corresponding to the elementary row operation \(R_1 \leftarrow R_1 + 2R_2\).
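As a quick numerical sanity check of this factorization (using NumPy, which the text does not otherwise assume), one can multiply the three elementary matrices and confirm the product is \(A\):

```python
import numpy as np

# The three elementary matrices from the example above.
E1 = np.array([[1, 0], [3, 1]])    # R2 <- R2 + 3*R1
E2 = np.array([[1, 0], [0, -7]])   # R2 <- -7*R2
E3 = np.array([[1, 2], [0, 1]])    # R1 <- R1 + 2*R2

A = E1 @ E2 @ E3
print(A)  # equals [[1, 2], [3, -1]]
```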

With the above result, one can generate an arbitrary invertible matrix simply by starting with an elementary matrix and applying an arbitrary sequence of elementary row operations, because multiplying a matrix on the left by an elementary matrix is the same as performing the corresponding elementary row operation.
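The generation idea can be sketched in code. Below is a minimal sketch (function name and NumPy usage are my own choices, not from the text) that starts from the identity and applies random elementary row operations; after the first operation the matrix is elementary, and every subsequent operation preserves invertibility:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_invertible(n, steps=10):
    """Build an invertible n x n matrix by applying random
    elementary row operations, as described above."""
    A = np.eye(n)
    for _ in range(steps):
        op = rng.integers(3)
        i, j = rng.choice(n, size=2, replace=False)
        if op == 0:                           # R_i <-> R_j
            A[[i, j]] = A[[j, i]]
        elif op == 1:                         # R_i <- c * R_i, c nonzero
            A[i] *= rng.choice([-2, -1, 2, 3])
        else:                                 # R_i <- R_i + c * R_j
            A[i] += rng.integers(-3, 4) * A[j]
    return A

M = random_invertible(3)
# Each elementary operation has a nonzero determinant factor,
# so the result is guaranteed to be invertible.
assert abs(np.linalg.det(M)) > 1e-9
```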

Recall that elementary matrices are invertible and the product of invertible matrices is again invertible. Hence, if \(A\) is the product of elementary matrices, then \(A\) is invertible.

We now show that if \(A\) is invertible, then it can be written as a product of elementary matrices.

Recall from the previous segment that if \(A\) is invertible, then the RREF of the matrix \([A ~|~ I]\) is \([I ~|~ A^{-1}]\). Therefore, there exist elementary matrices \(M_1,\ldots,M_k\) such that \(M_k\cdots M_1[A ~|~ I]= [I ~|~ A^{-1}]\), or equivalently \([M_k\cdots M_1A ~|~ M_k\cdots M_1]= [I ~|~ A^{-1}]\). Comparing the right halves of both sides gives \(A^{-1} = M_k\cdots M_1\). Thus, \(A = (A^{-1})^{-1} = (M_k\cdots M_1)^{-1} = M_1^{-1}\cdots M_k^{-1}\), a product of elementary matrices!
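The row reduction of \([A ~|~ I]\) to \([I ~|~ A^{-1}]\) can also be carried out numerically. Here is a minimal Gauss-Jordan sketch (my own illustration, assuming \(A\) is invertible over the reals; the function name is hypothetical):

```python
import numpy as np

def inverse_via_rref(A):
    """Row-reduce [A | I] to [I | A^-1] using the three kinds
    of elementary row operations."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])
    for col in range(n):
        # R_i <-> R_j: swap in a row with the largest pivot.
        pivot = col + np.argmax(np.abs(M[col:, col]))
        M[[col, pivot]] = M[[pivot, col]]
        # R_i <- c * R_i: normalize the pivot to 1.
        M[col] /= M[col, col]
        # R_j <- R_j + c * R_i: clear the rest of the column.
        for row in range(n):
            if row != col:
                M[row] -= M[row, col] * M[col]
    return M[:, n:]

A = np.array([[2.0, 2.0], [1.0, 2.0]])
print(inverse_via_rref(A))  # agrees with np.linalg.inv(A)
```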

Example

Write \(A = \begin{bmatrix} 2 & 2 \\ 1 & 2 \end{bmatrix}\) as a product of elementary matrices.

Solution. The above discussion suggests that we transform \([A ~|~ I]\) to RREF using elementary row operations. However, we do not really need to work with the matrix \([A ~|~ I]\). All we need to know is the sequence of elementary row operations that brings \(A\) to \(I\).

Note that \begin{eqnarray*} \begin{bmatrix} 2 & 2 \\ 1 & 2 \end{bmatrix} & \stackrel{R_1 \leftrightarrow R_2}{\longrightarrow} & \begin{bmatrix} 1 & 2 \\ 2 & 2 \end{bmatrix} \\ & \stackrel{R_2 \leftarrow R_2 - 2R_1}{\longrightarrow} & \begin{bmatrix} 1 & 2 \\ 0 & -2 \end{bmatrix} \\ & \stackrel{R_1 \leftarrow R_1 + R_2}{\longrightarrow} & \begin{bmatrix} 1 & 0 \\ 0 & -2 \end{bmatrix} \\ & \stackrel{R_2 \leftarrow -\frac{1}{2}R_2}{\longrightarrow} & \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}. \end{eqnarray*}

Thus, \begin{eqnarray*} A & = & \begin{bmatrix} 0 & 1 \\ 1 & 0\end{bmatrix}^{-1} \begin{bmatrix} 1 & 0 \\ -2 & 1 \end{bmatrix}^{-1} \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}^{-1} \begin{bmatrix} 1 & 0 \\ 0 & -\frac{1}{2} \end{bmatrix}^{-1} \\ & = & \begin{bmatrix} 0 & 1 \\ 1 & 0\end{bmatrix} \begin{bmatrix} 1 & 0 \\ 2 & 1 \end{bmatrix} \begin{bmatrix} 1 & -1 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} 1 & 0 \\ 0 & -2 \end{bmatrix} \end{eqnarray*}
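As before, the factorization can be verified numerically (a quick NumPy check, not part of the original solution):

```python
import numpy as np

# The four inverted elementary matrices from the solution, in order.
Es = [np.array([[0, 1], [1, 0]]),    # inverse of R1 <-> R2 (itself)
      np.array([[1, 0], [2, 1]]),    # inverse of R2 <- R2 - 2*R1
      np.array([[1, -1], [0, 1]]),   # inverse of R1 <- R1 + R2
      np.array([[1, 0], [0, -2]])]   # inverse of R2 <- -(1/2)*R2

product = Es[0] @ Es[1] @ Es[2] @ Es[3]
print(product)  # equals [[2, 2], [1, 2]], i.e. the matrix A
```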

Exercises

  1. Write each of the following matrices as a product of elementary matrices.

    1. \(\begin{bmatrix} 2 & 3 \\ 4 & 7 \end{bmatrix}\)  

    2. \(\begin{bmatrix} 0 & 0 & 1 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \end{bmatrix}\)