Eigenvalues and eigenvectors appear in many applications, such as solving linear differential equations, digital signal processing, facial recognition, Google's original PageRank algorithm, and Markov chains in random processes.

Let \(n\) be a positive integer.
Let \(A \in \mathbb{C}^{n \times n}\), \(x \in \mathbb{C}^n\) be non-zero,
and \(\lambda \in \mathbb{C}\).
We say \(x\) is an **eigenvector** of \(A\) with
**eigenvalue** \(\lambda\) if \(Ax = \lambda x\).

In other words, if \(x\) is an eigenvector of \(A\), then \(Ax\) is simply a scalar multiple of \(x\). Note that by definition, the zero vector is NEVER an eigenvector.

Let \(A = \begin{bmatrix} 1 & 2 \\ 1 & 0\end{bmatrix}\). Let \(x = \begin{bmatrix} 2 \\ 1\end{bmatrix}\). Note that \[Ax = \begin{bmatrix} 1\cdot 2 + 2\cdot 1 \\ 1\cdot 2 + 0 \cdot 1\end{bmatrix} = \begin{bmatrix} 4 \\ 2\end{bmatrix} = 2x.\] By definition, \(x\) is an eigenvector of \(A\) with eigenvalue \(2\).
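This check is easy to reproduce numerically. A minimal sketch using NumPy (the variable names here are ours):

```python
import numpy as np

A = np.array([[1, 2],
              [1, 0]])
x = np.array([2, 1])

print(A @ x)                       # matrix-vector product: [4 2]
print(np.allclose(A @ x, 2 * x))   # True: x is an eigenvector with eigenvalue 2
```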

How do we find eigenvalues and eigenvectors?

Suppose that \(x\) is an eigenvector of \(A\) with eigenvalue \(\lambda\). Since \(\lambda x = (\lambda I_n) x\), where \(I_n\) is the \(n \times n\) identity matrix, the equation \(Ax = \lambda x\) can be written as \[Ax - (\lambda I_n) x = 0.\] Factoring out \(x\) gives \[(A - \lambda I_n) x = 0.\] Hence, \(x\) is a nonzero vector in the nullspace of \(A - \lambda I_n\). For the example above, we have \(A - 2I_2 = \begin{bmatrix} -1 & 2 \\ 1 & -2\end{bmatrix}\), the nullspace of which contains the nonzero vector \(x = \begin{bmatrix} 2 \\ 1\end{bmatrix}\).
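We can confirm numerically that \(x\) lies in the nullspace of \(A - 2I_2\) (a quick NumPy sketch, continuing the example above):

```python
import numpy as np

A = np.array([[1, 2],
              [1, 0]])
x = np.array([2, 1])

M = A - 2 * np.eye(2)   # A - 2*I_2
print(M)                # [[-1.  2.]
                        #  [ 1. -2.]]
print(M @ x)            # the zero vector, so x is in the nullspace
```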

In other words, for any eigenvalue \(\lambda\) of \(A\), the matrix
\(A - \lambda I_n\) is singular, implying that its determinant
is zero.
Note that \(\det(A-\lambda I_n)\) is a polynomial in \(\lambda\) of
degree \(n\) and is called the
**characteristic polynomial** of \(A\), denoted by \(p_A\).
(Some books define the characteristic polynomial of
\(A\) as \(\det(\lambda I_n -A)\) instead. The two polynomials
differ only by a factor of \((-1)^n\), so
either definition gives the same roots.)

For the example above, \(p_A = \det(A-\lambda I_2) = \left|\begin{matrix} 1 -\lambda & 2 \\ 1 & -\lambda\end{matrix}\right| = (1 -\lambda)(-\lambda) - 2\cdot 1 = \lambda^2 - \lambda - 2 = (\lambda - 2)(\lambda + 1)\). Note that \(-1\) is another root of this polynomial.
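Once the characteristic polynomial is in hand, finding eigenvalues reduces to root-finding. A hedged sketch using NumPy's `np.roots`, with the coefficients of \(\lambda^2 - \lambda - 2\) listed from highest degree to lowest:

```python
import numpy as np

# coefficients of p_A(lambda) = lambda^2 - lambda - 2, highest degree first
coeffs = [1, -1, -2]
roots = np.roots(coeffs)
print(np.sort(roots))   # approximately [-1., 2.]
```

(`np.roots` computes roots numerically, via the eigenvalues of a companion matrix, so expect floating-point values rather than exact integers.)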

Is \(-1\) an eigenvalue of \(A\)? The answer is “yes”. To see this, we have to find a nonzero vector \(x\) such that \((A - (-1)I_2)x = 0.\) Now, \(A - (-1)I_2 = \begin{bmatrix} 2 & 2 \\ 1 & 1\end{bmatrix}\). By inspection, one sees that \(\begin{bmatrix} 1 \\ -1\end{bmatrix}\) is a nonzero vector in the nullspace of \(A-(-1)I_2\). Hence, \(\begin{bmatrix} 1 \\ -1\end{bmatrix}\) is an eigenvector of \(A\) with eigenvalue \(-1\).

There are no other eigenvalues of \(A\) because we have found all the roots to the polynomial. (A single-variable quadratic polynomial can have no more than two distinct roots.)

In general, every root of the characteristic polynomial is an eigenvalue. If \(\lambda\) is such that \(\det(A-\lambda I_n) = 0\), then \(A- \lambda I_n\) is singular and, therefore, its nullspace contains a nonzero vector. By definition, any such nonzero vector is an eigenvector of \(A\) with eigenvalue \(\lambda\).
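In practice, one rarely carries out this procedure by hand: library routines compute eigenvalues and eigenvectors directly. A minimal sketch using NumPy's `np.linalg.eig` on the running example:

```python
import numpy as np

A = np.array([[1, 2],
              [1, 0]])

# eig returns the eigenvalues and a matrix whose columns are (unit) eigenvectors
eigenvalues, eigenvectors = np.linalg.eig(A)
print(np.sort(eigenvalues))     # approximately [-1., 2.]

# each column v satisfies A v = lambda v
for lam, v in zip(eigenvalues, eigenvectors.T):
    print(np.allclose(A @ v, lam * v))   # True
```

Note that `eig` normalizes each eigenvector to unit length, so the columns returned are scalar multiples of the eigenvectors \(\begin{bmatrix} 2 \\ 1\end{bmatrix}\) and \(\begin{bmatrix} 1 \\ -1\end{bmatrix}\) found above.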

Let \(A = \begin{bmatrix} 1 & 2 \\ 0 & 1\end{bmatrix}\).

Write \(p_A\) in the form \(a\lambda^2 + b\lambda + c\) where \(a,b,c\in\mathbb{R}\).

Find all the eigenvalues of \(A\).

For each eigenvalue you have found, give an eigenvector. (Note that the answer is not unique.)

Find all eigenvalues of \(\begin{bmatrix} 7 & 12 & 4 \\ -8 & -13 & -4 \\ 16 & 24 & 7 \end{bmatrix}\).

Let \(A \in \mathbb{C}^{n\times n}\). Let \(v_1,\ldots,v_k\) be eigenvectors of \(A\) with distinct eigenvalues. Prove that \(v_1,\ldots,v_k\) are linearly independent.