Find the inverse matrix of \(\begin{bmatrix} -i & 1 \\ 2 & 0\end{bmatrix}\).
We perform row reduction on \(\left[\begin{array}{rr|rr} -i & 1 & 1 & 0 \\ 2 & 0 & 0 & 1 \end{array}\right].\) \begin{eqnarray*} & & \left[\begin{array}{rr|rr} -i & 1 & 1 & 0 \\ 2 & 0 & 0 & 1 \end{array}\right] \\ & \stackrel{R_1\leftarrow i R_1}{\longrightarrow} & \left[\begin{array}{rr|rr} 1 & i & i & 0 \\ 2 & 0 & 0 & 1 \end{array}\right] \\ & \stackrel{R_2\leftarrow R_2 - 2 R_1}{\longrightarrow} & \left[\begin{array}{rr|rr} 1 & i & i & 0 \\ 0 & -2i & -2i & 1 \end{array}\right] \\ & \stackrel{R_2\leftarrow \frac{i}{2} R_2}{\longrightarrow} & \left[\begin{array}{rr|rr} 1 & i & i & 0 \\ 0 & 1 & 1 & \frac{i}{2} \end{array}\right] \\ & \stackrel{R_1\leftarrow R_1 - i R_2}{\longrightarrow} & \left[\begin{array}{rr|rr} 1 & 0 & 0 & \frac{1}{2} \\ 0 & 1 & 1 & \frac{i}{2} \end{array}\right] \end{eqnarray*} So the inverse matrix is \(\begin{bmatrix} 0 & \frac{1}{2} \\ 1 & \frac{i}{2} \end{bmatrix}\).
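As a sanity check (a NumPy sketch, not part of the solution itself), we can verify that multiplying the original matrix by the computed inverse gives the identity:

```python
import numpy as np

# The matrix from the problem and the inverse found by row reduction.
A = np.array([[-1j, 1],
              [2, 0]], dtype=complex)
A_inv = np.array([[0, 0.5],
                  [1, 0.5j]], dtype=complex)

# Both products should equal the 2x2 identity matrix.
print(np.allclose(A @ A_inv, np.eye(2)))  # True
print(np.allclose(A_inv @ A, np.eye(2)))  # True
```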
Let \(A = \begin{bmatrix} 1 & 0 & 1 \\ 1 & 1 & 0 \\ 0 & 1 & 0\end{bmatrix}\) be defined over \(GF(2)\).
Find \(A^{-1}\).
We perform row reduction on \(\left[\begin{array}{rrr|rrr} 1 & 0 & 1 & 1 & 0 & 0 \\ 1 & 1 & 0 & 0 & 1 & 0 \\ 0 & 1 & 0 & 0 & 0 & 1 \end{array}\right]:\) \begin{eqnarray*} & & \left[\begin{array}{rrr|rrr} 1 & 0 & 1 & 1 & 0 & 0 \\ 1 & 1 & 0 & 0 & 1 & 0 \\ 0 & 1 & 0 & 0 & 0 & 1 \end{array}\right] \\ & \stackrel{R_2 \leftarrow R_2 + R_1}{\longrightarrow} & \left[\begin{array}{rrr|rrr} 1 & 0 & 1 & 1 & 0 & 0 \\ 0 & 1 & 1 & 1 & 1 & 0 \\ 0 & 1 & 0 & 0 & 0 & 1 \end{array}\right] \\ & \stackrel{R_2 \leftrightarrow R_3}{\longrightarrow} & \left[\begin{array}{rrr|rrr} 1 & 0 & 1 & 1 & 0 & 0 \\ 0 & 1 & 0 & 0 & 0 & 1 \\ 0 & 1 & 1 & 1 & 1 & 0 \end{array}\right] \\ & \stackrel{R_3 \leftarrow R_3 + R_2}{\longrightarrow} & \left[\begin{array}{rrr|rrr} 1 & 0 & 1 & 1 & 0 & 0 \\ 0 & 1 & 0 & 0 & 0 & 1 \\ 0 & 0 & 1 & 1 & 1 & 1 \end{array}\right] \\ & \stackrel{R_1 \leftarrow R_1 + R_3}{\longrightarrow} & \left[\begin{array}{rrr|rrr} 1 & 0 & 0 & 0 & 1 & 1 \\ 0 & 1 & 0 & 0 & 0 & 1 \\ 0 & 0 & 1 & 1 & 1 & 1 \end{array}\right] \end{eqnarray*} So \(A^{-1} = \begin{bmatrix} 0 & 1 & 1 \\ 0 & 0 & 1\\ 1 & 1 & 1\end{bmatrix}\).
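The computation over \(GF(2)\) can be spot-checked numerically by multiplying the integer matrices and reducing mod 2 (a sketch, not part of the solution itself):

```python
import numpy as np

# A and the computed inverse, with all arithmetic taken mod 2 (GF(2)).
A = np.array([[1, 0, 1],
              [1, 1, 0],
              [0, 1, 0]])
A_inv = np.array([[0, 1, 1],
                  [0, 0, 1],
                  [1, 1, 1]])

# Both products, reduced mod 2, should be the 3x3 identity.
print((A @ A_inv) % 2)
print((A_inv @ A) % 2)
```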
Write \(A\) as a product of elementary matrices.
Using the sequence of elementary row operations from the previous part, we obtain that \begin{eqnarray*} A & = & \begin{bmatrix} 1 & 0 & 0 \\ 1 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}^{-1} \begin{bmatrix} 1 & 0 & 0 \\ 0 & 0 & 1 \\ 0 & 1 & 0 \end{bmatrix}^{-1} \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 1 & 1 \end{bmatrix}^{-1} \begin{bmatrix} 1 & 0 & 1 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}^{-1} \\ & = & \begin{bmatrix} 1 & 0 & 0 \\ 1 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} 1 & 0 & 0 \\ 0 & 0 & 1 \\ 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 1 & 1 \end{bmatrix} \begin{bmatrix} 1 & 0 & 1 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} \end{eqnarray*} because elementary matrices over \(GF(2)\) are all self-inverses.
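The factorization can be checked the same way: multiply the four elementary matrices in the stated order and reduce mod 2 (a verification sketch, not part of the solution):

```python
import numpy as np

# The four elementary matrices, each self-inverse over GF(2).
E1 = np.array([[1, 0, 0], [1, 1, 0], [0, 0, 1]])  # R2 <- R2 + R1
E2 = np.array([[1, 0, 0], [0, 0, 1], [0, 1, 0]])  # R2 <-> R3
E3 = np.array([[1, 0, 0], [0, 1, 0], [0, 1, 1]])  # R3 <- R3 + R2
E4 = np.array([[1, 0, 1], [0, 1, 0], [0, 0, 1]])  # R1 <- R1 + R3

A = np.array([[1, 0, 1], [1, 1, 0], [0, 1, 0]])

# The product E1 E2 E3 E4, reduced mod 2, should reproduce A.
print((E1 @ E2 @ E3 @ E4) % 2)
```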
Show that \(A = \begin{bmatrix} 4 & -1 & 2\\ -2 & 1 & 1\\ 2 & 0 & 3\end{bmatrix}\) is singular.
We need to find a nonzero 3-tuple \(x\) such that \(Ax = 0\). We first row reduce \(A\) as follows:
\begin{eqnarray*} && \begin{bmatrix} 4 & -1 & 2\\ -2 & 1 & 1\\ 2 & 0 & 3\end{bmatrix} \\ & \stackrel{R_3 \leftarrow R_3 + R_2}{\longrightarrow} & \begin{bmatrix} 4 & -1 & 2\\ -2 & 1 & 1\\ 0 & 1 & 4\end{bmatrix} \\ & \stackrel{R_2 \leftarrow R_2 + \frac{1}{2}R_1}{\longrightarrow} & \begin{bmatrix} 4 & -1 & 2\\ 0 & \frac{1}{2} & 2\\ 0 & 1 & 4\end{bmatrix} \\ & \stackrel{R_2 \leftarrow 2R_2}{\longrightarrow} & \begin{bmatrix} 4 & -1 & 2\\ 0 & 1 & 4\\ 0 & 1 & 4\end{bmatrix} \\ & \stackrel{R_3 \leftarrow R_3 - R_2}{\longrightarrow} & \begin{bmatrix} 4 & -1 & 2\\ 0 & 1 & 4\\ 0 & 0 & 0\end{bmatrix} \\ & \stackrel{R_1 \leftarrow R_1 + R_2}{\longrightarrow} & \begin{bmatrix} 4 & 0 & 6\\ 0 & 1 & 4\\ 0 & 0 & 0\end{bmatrix} \\ & \stackrel{R_1 \leftarrow \frac{1}{4}R_1}{\longrightarrow} & \begin{bmatrix} 1 & 0 & \frac{3}{2}\\ 0 & 1 & 4\\ 0 & 0 & 0\end{bmatrix} \end{eqnarray*} In the reduced form, \(x_3\) is a free variable; setting \(x_3 = 1\) forces \(x_1 = -\frac{3}{2}\) and \(x_2 = -4\). Hence, \(x = \begin{bmatrix} -\frac{3}{2} \\ -4 \\ 1 \end{bmatrix}\) satisfies \(Ax = 0\), implying that \(A\) is singular.
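A quick numerical check (a sketch, separate from the solution) confirms that this vector lies in the null space of \(A\):

```python
import numpy as np

A = np.array([[4, -1, 2],
              [-2, 1, 1],
              [2, 0, 3]], dtype=float)
x = np.array([-1.5, -4, 1])

# A nonzero vector x with Ax = 0 certifies that A is singular.
print(np.allclose(A @ x, np.zeros(3)))  # True
```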
Simplify the following matrix expression: \(\left(\begin{bmatrix} 1 \\ 2 \end{bmatrix} \begin{bmatrix} 1 & -1 \end{bmatrix} - \begin{bmatrix} 2 & -1 \\ 0 & 1\end{bmatrix}^3\right)^\mathsf{T}.\)
\begin{eqnarray*} \left(\begin{bmatrix} 1 \\ 2 \end{bmatrix} \begin{bmatrix} 1 & -1 \end{bmatrix} - \begin{bmatrix} 2 & -1 \\ 0 & 1\end{bmatrix}^3\right)^\mathsf{T} & = & \left(\begin{bmatrix} 1 & -1 \\ 2 & -2 \end{bmatrix} - \begin{bmatrix} 2 & -1 \\ 0 & 1\end{bmatrix}^3\right)^\mathsf{T}\\ & = & \left(\begin{bmatrix} 1 & -1 \\ 2 & -2 \end{bmatrix} - \begin{bmatrix} 2 & -1 \\ 0 & 1\end{bmatrix} \begin{bmatrix} 2 & -1 \\ 0 & 1\end{bmatrix}^2\right)^\mathsf{T}\\ & = & \left(\begin{bmatrix} 1 & -1 \\ 2 & -2 \end{bmatrix} - \begin{bmatrix} 2 & -1 \\ 0 & 1\end{bmatrix} \left(\begin{bmatrix} 2 & -1 \\ 0 & 1\end{bmatrix} \begin{bmatrix} 2 & -1 \\ 0 & 1\end{bmatrix}\right)\right)^\mathsf{T}\\ & = & \left(\begin{bmatrix} 1 & -1 \\ 2 & -2 \end{bmatrix} - \begin{bmatrix} 2 & -1 \\ 0 & 1\end{bmatrix} \begin{bmatrix} 4 & -3 \\ 0 & 1\end{bmatrix}\right)^\mathsf{T}\\ & = & \left(\begin{bmatrix} 1 & -1 \\ 2 & -2 \end{bmatrix} - \begin{bmatrix} 8 & -7 \\ 0 & 1\end{bmatrix}\right)^\mathsf{T}\\ & = & \begin{bmatrix} -7 & 6 \\ 2 & -3 \end{bmatrix}^\mathsf{T}\\ & = & \begin{bmatrix} -7 & 2 \\ 6 & -3 \end{bmatrix} \end{eqnarray*}
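The computation can be reproduced in one line with NumPy (a check sketch, not part of the solution):

```python
import numpy as np

u = np.array([[1], [2]])        # column vector
v = np.array([[1, -1]])         # row vector
B = np.array([[2, -1], [0, 1]])

# Outer product minus the matrix cube, then transpose.
result = (u @ v - np.linalg.matrix_power(B, 3)).T
print(result)
```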
Let \(A = \begin{bmatrix} a & b \\ c & d\end{bmatrix}\) be defined over the real numbers. Prove that \(A\) is invertible if and only if \(ad - bc \neq 0\).
We first show that if \(ad - bc = 0\), then \(A\) is singular and therefore noninvertible.
If \(a = c = 0\), then \(A\) is clearly singular since \(A\begin{bmatrix} 1 \\ 0\end{bmatrix} = \begin{bmatrix} 0 \\ 0\end{bmatrix}\).
Assume that not both \(a\) and \(c\) are \(0\). Then at least one of \(u = \begin{bmatrix} d \\ -c\end{bmatrix}\) and \(v = \begin{bmatrix} b \\ -a \end{bmatrix}\) is not equal to \(\begin{bmatrix} 0 \\ 0\end{bmatrix}\): if \(a \neq 0\), then \(v \neq 0\), and if \(c \neq 0\), then \(u \neq 0\). But \(Au = \begin{bmatrix} ad - bc \\ cd - dc\end{bmatrix} = \begin{bmatrix} 0 \\ 0\end{bmatrix}\) and \(Av = \begin{bmatrix} ab - ba \\ cb - da\end{bmatrix} = \begin{bmatrix} 0 \\ 0\end{bmatrix}\). Hence, \(A\) is singular and therefore noninvertible.
We now show that if \(ad - bc \neq 0\), then \(A\) is invertible.
Suppose that \(a \neq 0\). Then the following row reduction is valid: \begin{eqnarray*} & & \left[\begin{array}{rr|rr} a & b & 1 & 0 \\ c & d & 0 & 1 \\ \end{array}\right] \\ & \stackrel{R_1 \leftarrow \frac{1}{a} R_1}{\longrightarrow} & \left[\begin{array}{rr|rr} 1 & \frac{b}{a} & \frac{1}{a} & 0 \\ c & d & 0 & 1 \end{array}\right] \\ & \stackrel{R_2 \leftarrow R_2 -c R_1}{\longrightarrow} & \left[\begin{array}{rr|rr} 1 & \frac{b}{a} & \frac{1}{a} & 0 \\ 0 & \frac{ad-bc}{a} & -\frac{c}{a} & 1 \end{array}\right] \\ & \stackrel{R_2 \leftarrow \frac{a}{ad-bc}R_2 }{\longrightarrow} & \left[\begin{array}{rr|rr} 1 & \frac{b}{a} & \frac{1}{a} & 0 \\ 0 & 1 & -\frac{c}{ad-bc} & \frac{a}{ad-bc} \end{array}\right] \\ & \stackrel{R_1 \leftarrow R_1 - \frac{b}{a}R_2 }{\longrightarrow} & \left[\begin{array}{rr|rr} 1 & 0 & \frac{d}{ad-bc} & -\frac{b}{ad-bc} \\ 0 & 1 & -\frac{c}{ad-bc} & \frac{a}{ad-bc} \end{array}\right] \end{eqnarray*}
Hence, \(A\) is invertible.
Now, suppose that \(a = 0\). As \(ad -bc \neq 0\), we must have \(c \neq 0\). The details are similar to the previous case and are left as an exercise.
(The solution presented here is rather tedious. We will see that using determinants, we can write down a more elegant proof.)
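The row reduction in the case \(a \neq 0\) agrees with the familiar closed form \(A^{-1} = \frac{1}{ad-bc}\begin{bmatrix} d & -b \\ -c & a\end{bmatrix}\), which can be spot-checked numerically (a sketch using an arbitrary sample matrix with \(ad - bc \neq 0\)):

```python
import numpy as np

# Arbitrary sample entries with ad - bc = 3*2 - 1*5 = 1 != 0.
a, b, c, d = 3.0, 1.0, 5.0, 2.0
A = np.array([[a, b], [c, d]])

# Closed-form 2x2 inverse: (1/(ad - bc)) * [[d, -b], [-c, a]].
det = a * d - b * c
A_inv = np.array([[d, -b], [-c, a]]) / det

print(np.allclose(A @ A_inv, np.eye(2)))  # True
```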
Let \(\mathbb{F}\) denote a field. Let \(A \in \mathbb{F}^{m\times n}\). Let \(\lambda \in \mathbb{F}\). Prove that \((\lambda A)^\mathsf{T} = \lambda A^\mathsf{T}\).
Let \(B = \lambda A\). We show that \(B^\mathsf{T} = \lambda A^\mathsf{T}\).
Note that the \((i,j)\)-entry of \(B\) is \(\lambda A_{i,j}\). Hence, the \((i,j)\)-entry of \(B^\mathsf{T}\) is the \((j,i)\)-entry of \(B\) and therefore is \(\lambda A_{j,i}\).
Now, the \((i,j)\)-entry of \(\lambda A^\mathsf{T}\) is given by \(\lambda\) times the \((i,j)\)-entry of \(A^\mathsf{T}\), which is \(\lambda A_{j,i}\). This is equal to the \((i,j)\)-entry of \(B^\mathsf{T}\). Thus, \(B^\mathsf{T} = \lambda A^\mathsf{T}\).
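The identity is easy to confirm on a concrete example (a sketch with an arbitrary sample matrix and scalar, not a substitute for the proof):

```python
import numpy as np

# Arbitrary 2x3 integer matrix and scalar.
A = np.array([[1, 2, 3],
              [4, 5, 6]])
lam = 7

# (lambda * A)^T should equal lambda * (A^T) entrywise.
print(((lam * A).T == lam * A.T).all())  # True
```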
Let \(A \in \mathbb{F}^{n\times n}\) where \(n\) is a positive integer and \(\mathbb{F}\) denotes a field. Prove that if \(A\) is nonsingular, then \(A\) is invertible.
Suppose, towards a contradiction, that \(A\) is nonsingular but noninvertible. Let \(R\) be the RREF of \(A\). Then \(R\) cannot be \(I_n\), so it has at least one nonpivot column. Thus, there exists a nontrivial solution \(x\) to \(Rx = 0\). But this \(x\) also satisfies \(Ax = 0\) since \(R\) is obtained from \(A\) via a sequence of elementary row operations, which preserve the solution set. Thus, \(A\) is singular, a contradiction.