
Definition

Let $A$ be an $n \times n$ matrix. $A$ is said to be symmetric if $A = A^T$.

Examples

$\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$, $\begin{bmatrix} \pi & 1 \\ 1 & 2 \end{bmatrix}$, $\begin{bmatrix} 1 & 2 & 3 \\ 2 & 4 & 5 \\ 3 & 5 & 6 \end{bmatrix}$
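As a quick sanity check, here is a minimal NumPy sketch that verifies the defining property $A = A^T$ for the three matrices above (NumPy is just one convenient choice for such experiments):

import numpy as np

# The three example matrices above; each should equal its own transpose.
examples = [
    np.array([[1.0, 0.0],
              [0.0, 1.0]]),
    np.array([[np.pi, 1.0],
              [1.0, 2.0]]),
    np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 5.0],
              [3.0, 5.0, 6.0]]),
]

for A in examples:
    print(np.array_equal(A, A.T))   # prints True for a symmetric matrix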

Symmetric matrices are found in many applications such as control theory, statistical analyses, and optimization.

Eigenvalues of real symmetric matrices

Real symmetric matrices have only real eigenvalues. We will establish the 2×2 case here. Proving the general case requires a bit of ingenuity.

Let $A$ be a $2 \times 2$ matrix with real entries. Then $A = \begin{bmatrix} a & b \\ b & c \end{bmatrix}$ for some real numbers $a, b, c$. The eigenvalues of $A$ are all values of $\lambda$ satisfying $\begin{vmatrix} a - \lambda & b \\ b & c - \lambda \end{vmatrix} = 0$. Expanding the left-hand side, we get $\lambda^2 - (a + c)\lambda + ac - b^2 = 0$. The left-hand side is a quadratic in $\lambda$ with discriminant $(a + c)^2 - 4ac + 4b^2 = (a - c)^2 + 4b^2$, which is a sum of two squares of real numbers and is therefore nonnegative for all real values of $a, b, c$. Hence, all roots of the quadratic are real, and so all eigenvalues of $A$ are real.
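For readers who like to experiment, here is a small NumPy sketch of the same computation for one arbitrary choice of $a, b, c$; it checks that the discriminant is nonnegative and that the roots of the quadratic agree with the eigenvalues computed directly.

import numpy as np

# An arbitrary 2x2 real symmetric matrix [[a, b], [b, c]].
a, b, c = 1.0, 5.0, -3.0
A = np.array([[a, b],
              [b, c]])

# Discriminant of the characteristic polynomial: (a - c)^2 + 4b^2 >= 0.
print((a - c) ** 2 + 4 * b ** 2)

# Roots of lambda^2 - (a + c)*lambda + (ac - b^2) are the eigenvalues of A.
print(np.sort(np.roots([1.0, -(a + c), a * c - b ** 2])))
print(np.sort(np.linalg.eigvalsh(A)))   # same values, computed directly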

Orthogonal matrix

Real symmetric matrices not only have real eigenvalues; they are also always diagonalizable. In fact, more can be said about the diagonalization.

We say that $U \in \mathbb{R}^{n \times n}$ is orthogonal if $U^T U = U U^T = I_n$. In other words, $U$ is orthogonal if $U^{-1} = U^T$.

If we denote column $j$ of $U$ by $u_j$, then the $(i,j)$-entry of $U^T U$ is given by $u_i \cdot u_j$. Since $U^T U = I$, we must have $u_j \cdot u_j = 1$ for all $j = 1, \ldots, n$ and $u_i \cdot u_j = 0$ for all $i \neq j$. Therefore, the columns of $U$ are pairwise orthogonal and each column has norm 1. We say that the columns of $U$ are orthonormal. A vector in $\mathbb{R}^n$ having norm 1 is called a unit vector.

Examples

The identity matrix is trivially orthogonal. Here are two nontrivial orthogonal matrices: $\frac{1}{\sqrt{2}}\begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix}$, $\frac{1}{9}\begin{bmatrix} 7 & 4 & -4 \\ 4 & 1 & 8 \\ -4 & 8 & 1 \end{bmatrix}$
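To double-check, a short NumPy sketch confirming that $U^T U = I$ and that every column is a unit vector for both matrices:

import numpy as np

U1 = (1 / np.sqrt(2)) * np.array([[1.0,  1.0],
                                  [1.0, -1.0]])
U2 = (1 / 9) * np.array([[ 7.0, 4.0, -4.0],
                         [ 4.0, 1.0,  8.0],
                         [-4.0, 8.0,  1.0]])

for U in (U1, U2):
    print(np.allclose(U.T @ U, np.eye(U.shape[0])))   # U^T U = I
    print(np.linalg.norm(U, axis=0))                  # each column has norm 1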

Orthogonal diagonalization

A real square matrix $A$ is orthogonally diagonalizable if there exist an orthogonal matrix $U$ and a diagonal matrix $D$ such that $A = U D U^T$. Orthogonalization is used quite extensively in certain statistical analyses.

An orthogonally diagonalizable matrix is necessarily symmetric. Indeed, $(U D U^T)^T = (U^T)^T D^T U^T = U D U^T$ since the transpose of a diagonal matrix is the matrix itself.
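A tiny NumPy sketch of this fact: take an orthogonal $U$ (the $2 \times 2$ example from earlier) and an arbitrary diagonal $D$; the product $U D U^T$ comes out symmetric.

import numpy as np

U = (1 / np.sqrt(2)) * np.array([[1.0,  1.0],
                                 [1.0, -1.0]])
D = np.diag([5.0, -2.0])        # diagonal entries chosen arbitrarily

A = U @ D @ U.T
print(np.allclose(A, A.T))      # True: U D U^T is symmetric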

The amazing thing is that the converse is also true: Every real symmetric matrix is orthogonally diagonalizable. The proof of this is a bit tricky. However, for the case when all the eigenvalues are distinct, there is a rather straightforward proof which we now give.

First, we claim that if $A$ is a real symmetric matrix and $u$ and $v$ are eigenvectors of $A$ with distinct eigenvalues $\lambda$ and $\gamma$, respectively, then $u^T v = 0$. To see this, observe that $\lambda u^T v = (\lambda u)^T v = (A u)^T v = u^T A^T v = u^T A v = \gamma u^T v$. Hence, if $u^T v \neq 0$, then $\lambda = \gamma$, contradicting that they are distinct. This proves the claim.
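The claim is easy to observe numerically. In the sketch below, the symmetric matrix is an arbitrary choice whose eigenvalues turn out to be distinct; the pairwise dot products of the computed eigenvectors vanish up to rounding.

import numpy as np

# An arbitrary real symmetric matrix with three distinct eigenvalues.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])

eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)                        # three distinct real eigenvalues

# Dot products of eigenvectors belonging to different eigenvalues.
print(eigvecs[:, 0] @ eigvecs[:, 1])  # ~0
print(eigvecs[:, 0] @ eigvecs[:, 2])  # ~0
print(eigvecs[:, 1] @ eigvecs[:, 2])  # ~0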

Now, let $A \in \mathbb{R}^{n \times n}$ be symmetric with distinct eigenvalues $\lambda_1, \ldots, \lambda_n$. Then every eigenspace is spanned by a single vector; say $u_i$ for the eigenvalue $\lambda_i$, $i = 1, \ldots, n$. We may assume that $u_i \cdot u_i = 1$ for $i = 1, \ldots, n$. If not, simply replace $u_i$ with $\frac{1}{\sqrt{u_i \cdot u_i}}\, u_i$. This step is called normalization.
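Concretely, with an arbitrary vector standing in for $u_i$, normalization looks like this in NumPy:

import numpy as np

u = np.array([3.0, 4.0])        # an arbitrary, unnormalized eigenvector
u = u / np.sqrt(u @ u)          # divide by the norm, sqrt(u . u)
print(u @ u)                    # now 1.0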

Let $U$ be an $n \times n$ matrix whose $i$th column is given by $u_i$. Let $D$ be the diagonal matrix with $\lambda_i$ as the $i$th diagonal entry. Then, $A = U D U^{-1}$.

To complete the proof, it suffices to show that $U^T = U^{-1}$. First, note that the $i$th diagonal entry of $U^T U$ is $u_i^T u_i = u_i \cdot u_i = 1$. Hence, all entries in the diagonal of $U^T U$ are 1.

Now, the $(i,j)$-entry of $U^T U$, where $i \neq j$, is given by $u_i^T u_j$. As $u_i$ and $u_j$ are eigenvectors with different eigenvalues, we see that $u_i^T u_j = 0$.

Thus, $U^T U = I_n$. Since $U$ is a square matrix, we have $U^T = U^{-1}$.

The above proof shows that in the case when the eigenvalues are distinct, one can find an orthogonal diagonalization by first diagonalizing the matrix in the usual way, obtaining a diagonal matrix $D$ and an invertible matrix $P$ such that $A = P D P^{-1}$. Then, normalizing each column of $P$ to form the matrix $U$, we will have $A = U D U^T$.
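In NumPy, this recipe might look like the following sketch, applied to an arbitrary symmetric matrix with distinct eigenvalues; np.linalg.eig plays the role of the usual diagonalization.

import numpy as np

# An arbitrary symmetric matrix with distinct eigenvalues.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])

eigvals, P = np.linalg.eig(A)     # A = P D P^{-1}
D = np.diag(eigvals)

# Normalize each column of P to obtain U.  (np.linalg.eig already returns
# unit-norm columns, so this is a no-op here, but it is shown for clarity.)
U = P / np.linalg.norm(P, axis=0)

print(np.allclose(U.T @ U, np.eye(3)))   # U is orthogonal
print(np.allclose(U @ D @ U.T, A))       # A = U D U^T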

To see a proof of the general case, click here.


Exercises

Give an orthogonal diagonalization of $A = \begin{bmatrix} 3 & 2 \\ 2 & 3 \end{bmatrix}$.