Let \(A\) be an \(n\times n\) real symmetric matrix. We prove by induction on \(n\) that \(A\) is orthogonally diagonalizable; that is, there exist an orthogonal matrix \(U\) and a diagonal matrix \(D\) such that \(A = UDU^\mathsf{T}\).
For the base case, if \(A\) is \(1\times 1\), we can simply take \(U = [1]\) and \(D = A\).
Now let \(n \geq 2\) and, as the induction hypothesis, assume that every \((n-1)\times (n-1)\) real symmetric matrix is orthogonally diagonalizable.
Consider an \(n\times n\) real symmetric matrix \(A\). Since \(A\) is real symmetric, all of its eigenvalues are real; let \(\lambda \in \mathbb{R}\) be an eigenvalue of \(A\) and let \(v \in \mathbb{R}^n\) be a corresponding eigenvector. Set \(q_1 = \frac{1}{\|v\|}v\). Extend \(\{q_1\}\) to an orthonormal basis \(\{q_1,q_2,\ldots,q_n\}\) of \(\mathbb{R}^n\). Let \(Q = [ q_1 \cdots q_n]\). Then, \begin{eqnarray*} AQ & = & [\lambda q_1~ Aq_2 ~\cdots~ Aq_n] \\ & = & Q\begin{bmatrix} \lambda & d \\ 0 & A' \end{bmatrix} \end{eqnarray*} for some \(d \in \mathbb{R}^{1\times (n-1)}\) and \(A' \in \mathbb{R}^{(n-1)\times (n-1)}\). Since \(Q\) is orthogonal by construction (its columns form an orthonormal basis), we obtain \[Q^\mathsf{T}A Q =\begin{bmatrix} \lambda & d \\ 0 & A' \end{bmatrix}.\] Taking the transpose of both sides gives \[Q^\mathsf{T}A^\mathsf{T} Q =\begin{bmatrix} \lambda & 0 \\ d^\mathsf{T} & A'^\mathsf{T} \end{bmatrix}.\] But \(A = A^\mathsf{T}\), so \(Q^\mathsf{T}A^\mathsf{T}Q = Q^\mathsf{T} A Q\); comparing the two block forms gives \(d = 0\) and \(A' = A'^\mathsf{T}\).
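The construction of \(Q\) and the resulting block form can be checked numerically. The following is an illustrative sketch (assuming NumPy; it is not part of the proof) for a sample \(4\times 4\) symmetric matrix: we normalize an eigenvector, extend it to an orthonormal basis via a QR factorization, and observe that \(Q^\mathsf{T}AQ\) has first column \((\lambda, 0, \ldots, 0)^\mathsf{T}\) and, by symmetry, first row \((\lambda, 0, \ldots, 0)\).

```python
import numpy as np

# Build a sample real symmetric matrix.
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = (M + M.T) / 2

# Eigenvalues of a real symmetric matrix are real; take one eigenpair.
w, V = np.linalg.eigh(A)
lmbda, v = w[0], V[:, 0]
q1 = v / np.linalg.norm(v)            # q1 = v / ||v||

# Extend {q1} to an orthonormal basis of R^4: QR-factor a full-rank matrix
# whose first column is q1; the columns of the Q factor are the basis
# (the first column agrees with q1 up to sign, which does not matter here).
Q, _ = np.linalg.qr(np.column_stack([q1, rng.standard_normal((4, 3))]))

# T should exhibit the block form [[lambda, d], [0, A']] with d = 0.
T = Q.T @ A @ Q
```

Here the subdiagonal block of the first column vanishes because \(Aq_1 = \lambda q_1\) is orthogonal to \(q_2,\ldots,q_n\), and the first row vanishes by symmetry of \(T\), matching the argument above.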
Since \(A'\) is an \((n-1)\times (n-1)\) real symmetric matrix, by the induction hypothesis, there exist \(Q', D' \in \mathbb{R}^{(n-1)\times (n-1)}\) with \(Q'\) orthogonal and \(D'\) diagonal such that \(A' = Q'D'Q'^\mathsf{T}\).
Hence, \[Q^\mathsf{T}A Q =\begin{bmatrix} \lambda & 0 \\ 0 & Q'D'Q'^\mathsf{T} \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & Q'\end{bmatrix} \begin{bmatrix} \lambda & 0 \\ 0 & D' \end{bmatrix} \begin{bmatrix} 1 & 0 \\ 0 & Q'^\mathsf{T}\end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & Q'\end{bmatrix} \begin{bmatrix} \lambda & 0 \\ 0 & D' \end{bmatrix} \begin{bmatrix} 1 & 0 \\ 0 & Q'\end{bmatrix}^\mathsf{T} .\] It follows that \[ A = Q \begin{bmatrix} 1 & 0 \\ 0 & Q'\end{bmatrix} \begin{bmatrix} \lambda & 0 \\ 0 & D' \end{bmatrix} \begin{bmatrix} 1 & 0 \\ 0 & Q'\end{bmatrix}^\mathsf{T} Q^\mathsf{T}.\] Setting \(U = Q \begin{bmatrix} 1 & 0 \\ 0 & Q'\end{bmatrix}\) and \(D = \begin{bmatrix} \lambda & 0 \\ 0 & D' \end{bmatrix}\), we obtain \(A = U D U^\mathsf{T}\) with \(D\) diagonal. It remains to show that \(U\) is orthogonal, i.e., that \(U^\mathsf{T} U = I_n\). Indeed, \begin{eqnarray*} U^\mathsf{T} U & = & \begin{bmatrix} 1 & 0 \\ 0 & Q'\end{bmatrix}^\mathsf{T} Q^\mathsf{T} Q \begin{bmatrix} 1 & 0 \\ 0 & Q'\end{bmatrix} \\ & = & \begin{bmatrix} 1 & 0 \\ 0 & Q'\end{bmatrix}^\mathsf{T} \begin{bmatrix} 1 & 0 \\ 0 & Q'\end{bmatrix} \\ & = & \begin{bmatrix} 1 & 0 \\ 0 & Q'^\mathsf{T}\end{bmatrix} \begin{bmatrix} 1 & 0 \\ 0 & Q'\end{bmatrix} \\ & = & \begin{bmatrix} 1 & 0 \\ 0 & Q'^\mathsf{T} Q'\end{bmatrix} \\ & = & \begin{bmatrix} 1 & 0 \\ 0 & I_{n-1} \end{bmatrix} \\ & = & I_n. \end{eqnarray*} Therefore \(U\) is orthogonal and \(A = UDU^\mathsf{T}\) is an orthogonal diagonalization of \(A\), completing the induction.
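The proof is constructive, and the induction translates directly into a recursive procedure. The following sketch (assuming NumPy; the function name `orthogonal_diagonalize` is ours, and `eigh` is used only to extract a single eigenpair, standing in for the existence of \(\lambda\) and \(v\)) mirrors the argument step by step: base case, one eigenpair, extension to an orthonormal basis, recursion on \(A'\), and assembly of \(U\) and \(D\) from the block matrices above.

```python
import numpy as np

def orthogonal_diagonalize(A, rng=None):
    """Mirror the inductive proof: return (U, D) with A = U D U^T,
    U orthogonal and D diagonal, for a real symmetric matrix A."""
    if rng is None:
        rng = np.random.default_rng(0)
    n = A.shape[0]
    if n == 1:
        return np.eye(1), A.copy()              # base case: U = [1], D = A
    # One real eigenpair (eigenvalues of a real symmetric matrix are real);
    # eigh stands in for "let lambda be an eigenvalue with eigenvector v".
    w, V = np.linalg.eigh(A)
    v = V[:, 0]
    q1 = v / np.linalg.norm(v)                  # q1 = v / ||v||
    # Extend {q1} to an orthonormal basis of R^n via a QR factorization.
    Q, _ = np.linalg.qr(np.column_stack([q1, rng.standard_normal((n, n - 1))]))
    T = Q.T @ A @ Q                             # block form [[lambda, 0], [0, A']]
    A1 = (T[1:, 1:] + T[1:, 1:].T) / 2          # A', symmetrized against roundoff
    Q1, D1 = orthogonal_diagonalize(A1, rng)    # induction step on (n-1) x (n-1)
    U = Q @ np.block([[np.eye(1), np.zeros((1, n - 1))],
                      [np.zeros((n - 1, 1)), Q1]])   # U = Q * diag(1, Q')
    D = np.block([[T[:1, :1], np.zeros((1, n - 1))],
                  [np.zeros((n - 1, 1)), D1]])       # D = diag(lambda, D')
    return U, D
```

One can then verify numerically that for a random symmetric \(A\), the returned \(U\) satisfies \(U^\mathsf{T}U = I\), \(D\) is diagonal, and \(UDU^\mathsf{T} = A\) up to floating-point error.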