## Main result

We now prove that a left inverse of a square matrix is also a right inverse. In other words, we show the following:

Let $$A, N \in \mathbb{F}^{n\times n}$$ where $$\mathbb{F}$$ denotes a field. If $$NA = I$$, then $$AN = I$$.

Before we look at the proof, note that the above statement also establishes that a right inverse is also a left inverse because we can view $$A$$ as the right inverse of $$N$$ (as $$NA = I$$) and the conclusion asserts that $$A$$ is a left inverse of $$N$$ (as $$AN = I$$).

To prove the above statement, we first establish the claim that $$Ax = y$$ has a solution for all $$y \in \mathbb{F}^n$$.

Note that the claim need not hold if $$A$$ does not have a left inverse. We postpone the proof of the claim to the end; let us first see how it yields the main result.
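To see why the left-inverse hypothesis matters, here is a minimal sketch with a hypothetical singular matrix (the specific $$A$$ and $$y$$ are chosen for illustration, not taken from the text above):

```python
def matvec(A, x):
    """Multiply a matrix (given as a list of rows) by a vector."""
    return [sum(a * b for a, b in zip(row, x)) for row in A]

# A = [[1, 0], [0, 0]] has no left inverse. Ax = (x1, 0) for every x,
# so Ax = y has no solution when y = (0, 1): the claim fails here.
A = [[1, 0], [0, 0]]
print(matvec(A, [3, 7]))  # → [3, 0]; the second entry is always 0
```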

Take an arbitrary element in $$\mathbb{F}^n$$ and call it $$y$$. Then the claim tells us that there is $$x' \in \mathbb{F}^n$$ such that $$Ax' = y$$. Multiplying both sides on the left by $$N$$, we get $$N(Ax') = Ny$$, giving $$(NA)x' = Ny$$ by associativity of matrix multiplication. As $$NA = I$$, we have $$x' = Ny$$.

Hence, $$y = Ax' = A(Ny) = (AN)y$$. Let $$D$$ denote the product $$AN$$, so $$y = Dy$$. But $$y$$ is arbitrary, so $$Dy = y$$ for every $$y \in \mathbb{F}^n$$. In particular, taking $$y$$ to be the $$i$$-th standard basis vector $$e_i$$ shows that the $$i$$-th column of $$D$$ is $$e_i$$. Therefore $$D = I$$.
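The theorem can be checked numerically on a concrete instance. Below is a minimal pure-Python sketch; the particular matrices $$A$$ and $$N$$ are an illustrative example chosen so that $$NA = I$$, and `matmul` is a helper defined here, not a library routine:

```python
def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# Example: N is a left inverse of the square matrix A.
A = [[1, 2], [3, 5]]
N = [[-5, 2], [3, -1]]
I2 = [[1, 0], [0, 1]]

assert matmul(N, A) == I2  # NA = I ...
assert matmul(A, N) == I2  # ... and AN = I follows, as the theorem asserts
```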

### Proof of claim

Suppose, for contradiction, that there exists $$y' \in \mathbb{F}^n$$ such that $$Ax = y'$$ has no solution. Row reduce the augmented matrix $$[A~y']$$ to $$[R~d]$$, where $$R$$ is in reduced row-echelon form.

As $$Ax = y'$$ has no solution, there must be an $$i$$ such that row $$i$$ of $$R$$ is all 0's and $$d_i \neq 0$$. Since $$R$$ is a square matrix with a zero row, not every column of $$R$$ can be a pivot column. So there is at least one free variable, implying that there is a nonzero $$\tilde{x}$$ in the null space of $$A$$ (i.e., $$A\tilde{x} = 0_n$$). But $$\tilde{x} = I \tilde{x} = (NA)\tilde{x} = N(A\tilde{x}) = N 0_n = 0_n$$, contradicting that $$\tilde{x}$$ is nonzero! So the assumption that there exists $$y' \in \mathbb{F}^n$$ such that $$Ax = y'$$ has no solution cannot be true.
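The contradiction above can be illustrated concretely. The matrices below are a hypothetical example: a singular $$A$$ with a visible nonzero null vector $$\tilde{x}$$, for which $$NA = I$$ is impossible no matter what $$N$$ is:

```python
def matvec(A, x):
    """Multiply a matrix (given as a list of rows) by a vector."""
    return [sum(a * b for a, b in zip(row, x)) for row in A]

A = [[1, 1], [2, 2]]   # singular: the two columns are dependent
x_tilde = [1, -1]      # a nonzero null vector of A

# A x~ = 0, so for ANY N we would get (NA)x~ = N(A x~) = N 0 = 0 != x~,
# which rules out NA = I.
assert matvec(A, x_tilde) == [0, 0]
```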

## Exercise

Let $$A = \begin{bmatrix} 2 & 0 \\ -1 & 0 \\ 1 & 1\end{bmatrix}$$ and let $$N = \begin{bmatrix} 1 & 1 & 0\\ -1 & -1 & 1 \end{bmatrix}$$. Compute the products $$NA$$ and $$AN$$. Do the answers contradict our main result? If not, why not?
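If you'd like to check your hand computation, here is a minimal pure-Python sketch (`matmul` is a helper defined here, not a library routine):

```python
def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[2, 0], [-1, 0], [1, 1]]
N = [[1, 1, 0], [-1, -1, 1]]

print(matmul(N, A))  # a 2x2 product
print(matmul(A, N))  # a 3x3 product
```

Note the shapes of the two products before deciding whether the main result is contradicted.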