Symmetric matrices are found in many applications such
as control theory, statistical analyses, and optimization.
Eigenvalues of real symmetric matrices
Real symmetric matrices have only real eigenvalues.
We will establish the $2 \times 2$ case here.
Proving the general case requires a bit of ingenuity.
Let $A$ be a $2 \times 2$ symmetric matrix with real entries. Then
$$A = \begin{bmatrix} a & b \\ b & c \end{bmatrix}$$
for some real numbers $a$, $b$, $c$.
The eigenvalues of $A$ are all values of $\lambda$
satisfying
$$\det(A - \lambda I) = 0.$$
Expanding the left-hand side, we get
$$\det(A - \lambda I) = (a - \lambda)(c - \lambda) - b^2 = \lambda^2 - (a + c)\lambda + (ac - b^2).$$
The left-hand side is a quadratic in $\lambda$ with discriminant
$$(a + c)^2 - 4(ac - b^2) = (a - c)^2 + 4b^2,$$
which is a sum of two squares of real numbers and is therefore
nonnegative for all real values $a$, $b$, $c$.
Hence, all roots of the quadratic
$$\lambda^2 - (a + c)\lambda + (ac - b^2) = 0$$
are real and so all eigenvalues of $A$ are real.
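The argument can be checked numerically. The sketch below (using NumPy, an assumption about tooling) samples random symmetric $2 \times 2$ matrices and confirms both the discriminant identity and the realness of the computed eigenvalues.

```python
import numpy as np

# Illustration (not part of the proof): sample random symmetric 2x2
# matrices [[a, b], [b, c]] and check that the discriminant of the
# characteristic polynomial equals (a - c)^2 + 4*b^2 >= 0, so the
# computed eigenvalues are real.
rng = np.random.default_rng(0)
for _ in range(1000):
    a, b, c = rng.normal(size=3)
    disc = (a + c) ** 2 - 4 * (a * c - b ** 2)
    assert np.isclose(disc, (a - c) ** 2 + 4 * b ** 2)  # algebraic identity
    assert disc >= 0                                    # sum of two squares
    eigenvalues = np.linalg.eigvals(np.array([[a, b], [b, c]]))
    assert np.allclose(np.imag(eigenvalues), 0)         # eigenvalues are real
print("all sampled symmetric 2x2 matrices had real eigenvalues")
```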
Orthogonal matrix
Real symmetric matrices not only have real eigenvalues;
they are also always diagonalizable.
In fact, more can be said about the diagonalization.
We say that a real square matrix $P$ is orthogonal
if $P^T P = I$.
In other words, $P$ is orthogonal if $P^{-1} = P^T$.
If we denote column $i$ of $P$ by $p_i$, then
the $(i,j)$-entry of $P^T P$ is given
by $p_i^T p_j$. Since $P^T P = I$,
we must have $p_i^T p_j = 0$
for all $i \neq j$ and $p_i^T p_i = 1$
for all $i$.
Therefore, the columns of $P$ are pairwise orthogonal and each
column has norm 1. We say that the columns of $P$ are orthonormal.
A vector in $\mathbb{R}^n$ having norm 1 is called a unit vector.
Examples
The identity matrix is trivially orthogonal. Here are two nontrivial
orthogonal matrices:
$$\begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}, \qquad
\frac{1}{\sqrt{2}}\begin{bmatrix} 1 & -1 \\ 1 & 1 \end{bmatrix}.$$
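A quick numerical sanity check of orthogonality is to test $Q^T Q = I$ directly. The two matrices below are illustrative choices (a permutation matrix and a rotation by 45 degrees), and NumPy is assumed for the check.

```python
import numpy as np

# Two orthogonal matrices: a permutation matrix and a 45-degree rotation.
Q1 = np.array([[0.0, 1.0],
               [1.0, 0.0]])
Q2 = np.array([[1.0, -1.0],
               [1.0,  1.0]]) / np.sqrt(2)

for Q in (Q1, Q2):
    assert np.allclose(Q.T @ Q, np.eye(2))     # columns are orthonormal
    assert np.allclose(np.linalg.inv(Q), Q.T)  # hence Q^{-1} = Q^T
print("both matrices are orthogonal")
```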
Orthogonal diagonalization
A real square matrix $A$ is orthogonally diagonalizable if
there exist an orthogonal matrix $P$ and a diagonal matrix $D$
such that $A = PDP^T$. Orthogonal diagonalization is used quite
extensively in certain statistical analyses.
An orthogonally diagonalizable matrix is necessarily symmetric.
Indeed, $A^T = (PDP^T)^T = (P^T)^T D^T P^T = PDP^T = A$
since the transpose of a diagonal matrix is the matrix itself.
The amazing thing is that the converse is also true: Every real symmetric
matrix is orthogonally diagonalizable.
The proof of this is a bit tricky.
However, for the case when all the eigenvalues are distinct,
there is a rather straightforward proof which we now give.
First, we claim that if $A$ is a real symmetric matrix
and $u$ and $v$ are eigenvectors of $A$ with
distinct eigenvalues $\lambda$ and $\mu$, respectively, then
$u^T v = 0$.
To see this, observe that
$$\lambda u^T v = (\lambda u)^T v = (Au)^T v = u^T A^T v = u^T A v = u^T (\mu v) = \mu u^T v.$$
Hence, if $u^T v \neq 0$, then $\lambda = \mu$, contradicting
that they are distinct. This proves the claim.
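The claim can be illustrated numerically (NumPy assumed). The general-purpose eigensolver `np.linalg.eig` does not enforce any orthogonality, yet for a real symmetric matrix the eigenvectors belonging to distinct eigenvalues come out orthogonal, up to rounding.

```python
import numpy as np

# Build a random real symmetric matrix and check that eigenvectors
# belonging to distinct eigenvalues have vanishing dot products.
rng = np.random.default_rng(1)
B = rng.normal(size=(4, 4))
A = B + B.T                   # a random real symmetric matrix
w, V = np.linalg.eig(A)       # eigenvalues w, eigenvectors as columns of V

for i in range(4):
    for j in range(i + 1, 4):
        if abs(w[i] - w[j]) > 1e-8:                  # distinct eigenvalues
            assert abs(V[:, i] @ V[:, j]) < 1e-8     # orthogonal eigenvectors
print("eigenvectors for distinct eigenvalues are orthogonal")
```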
Now, let $A$ be an $n \times n$ symmetric matrix with distinct eigenvalues
$\lambda_1, \ldots, \lambda_n$. Then every eigenspace is spanned
by a single vector; say for the eigenvalue $\lambda_i$,
$A v_i = \lambda_i v_i$. We may assume that $\|v_i\| = 1$
for $i = 1, \ldots, n$.
If not, simply replace $v_i$ with $\frac{1}{\|v_i\|} v_i$. This step
is called normalization.
Let $P$ be an $n \times n$ matrix whose $i$th
column is given by $v_i$.
Let $D$ be the diagonal matrix
with $\lambda_i$ as the $i$th diagonal entry.
Then $A = PDP^{-1}$.
To complete the proof, it suffices to show that $P^{-1} = P^T$.
First, note that the $i$th diagonal entry of $P^T P$
is $v_i^T v_i = \|v_i\|^2 = 1$. Hence, all entries in the
diagonal of $P^T P$ are 1.
Now, the $(i,j)$-entry of $P^T P$, where $i \neq j$, is given by
$v_i^T v_j$. As $v_i$ and $v_j$ are eigenvectors with
different eigenvalues, we see that $v_i^T v_j = 0$ by the claim above.
Thus, $P^T P = I$. Since $P$ is a square matrix,
we have $P^{-1} = P^T$.
The above proof shows that in the case when the eigenvalues are distinct,
one can find an orthogonal diagonalization by first diagonalizing the
matrix in the usual way, obtaining a diagonal matrix $D$ and an invertible
matrix $Q$ such that $A = QDQ^{-1}$.
Then normalizing each column of $Q$ to form the matrix $P$,
we will have $A = PDP^T$.
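The recipe above can be sketched in a few lines of NumPy (an illustrative implementation under the assumption that the eigenvalues are distinct): diagonalize, normalize the columns, then verify that the normalized eigenvector matrix is orthogonal and orthogonally diagonalizes the original matrix.

```python
import numpy as np

# Diagonalize A = Q D Q^{-1} in the usual way, normalize each column
# of Q to obtain P, and check that P is orthogonal with A = P D P^T.
rng = np.random.default_rng(2)
B = rng.normal(size=(3, 3))
A = B + B.T                            # random real symmetric matrix

w, Q = np.linalg.eig(A)                # ordinary diagonalization
P = Q / np.linalg.norm(Q, axis=0)      # normalize columns (np.linalg.eig
                                       # already returns unit columns, so
                                       # this step is a numerical no-op)
D = np.diag(w)

assert np.allclose(P.T @ P, np.eye(3))  # P is orthogonal (distinct eigenvalues)
assert np.allclose(A, P @ D @ P.T)      # orthogonal diagonalization of A
print("A = P D P^T with P orthogonal")
```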