Given two tuples \(u,v\in\mathbb{R}^n\), the **dot product**
of \(u\) and \(v\), denoted by \(u\cdot v\), is defined to be the
quantity \[u_1v_1 + u_2v_2 + \cdots + u_nv_n.\]

For example, if \(u = \begin{bmatrix} 1 \\2 \\3\end{bmatrix}\) and \(v = \begin{bmatrix} 4 \\ 5 \\ 6\end{bmatrix}\), then \[u\cdot v = (1)(4)+ (2)(5) + (3)(6) = 4 + 10 + 18 = 32.\]
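The definition translates directly into code. A minimal sketch in Python (the function name `dot` is our own choice), checking the example above:

```python
def dot(u, v):
    """Dot product u1*v1 + u2*v2 + ... + un*vn of two same-length tuples."""
    if len(u) != len(v):
        raise ValueError("u and v must have the same length")
    return sum(ui * vi for ui, vi in zip(u, v))

print(dot([1, 2, 3], [4, 5, 6]))  # 32, matching the worked example
```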

(Notice that the dot notation is overloaded and is used for multiplication in a field as well as the dot product of two tuples. As confusing as it might seem, the context will make it clear which is intended.)

We say that \(u, v\in \mathbb{R}^n\) are **orthogonal** if
\(u\cdot v = 0\).

When \(n = 2\) or \(n = 3\), if neither \(u\) nor \(v\) is the zero vector, then \(u \cdot v = 0\) if and only if the arrows representing \(u\) and \(v\) form a right angle. In fact, a bit more is known.
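Orthogonality is therefore a one-line check; a small sketch (helper name `is_orthogonal` is ours):

```python
def is_orthogonal(u, v):
    """True exactly when the dot product of u and v is zero."""
    return sum(ui * vi for ui, vi in zip(u, v)) == 0

print(is_orthogonal([1, 0], [0, 5]))  # True: the arrows form a right angle
print(is_orthogonal([1, 2], [2, 1]))  # False: dot product is 4
```

Note that for floating-point components one would compare against a small tolerance rather than testing exact equality with zero.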

Let \(\theta\) denote the value of the smaller angle between the arrows representing \(u\) and \(v\). (We often refer to this angle simply as the angle between \(u\) and \(v\).) Then \[\cos\theta = \displaystyle\frac{u\cdot v}{\sqrt{u \cdot u}\sqrt{v \cdot v}}.\] Clearly, when \(u \cdot v = 0\), \(\theta = \frac{\pi}{2}\) radians, the measure of a right angle.
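The formula for \(\cos\theta\) can be sketched numerically, again assuming nonzero vectors (the function name `angle` is our own):

```python
import math

def angle(u, v):
    """Angle in radians between nonzero vectors u and v, via
    cos(theta) = (u . v) / (sqrt(u . u) * sqrt(v . v))."""
    dot = sum(ui * vi for ui, vi in zip(u, v))
    norm_u = math.sqrt(sum(ui * ui for ui in u))
    norm_v = math.sqrt(sum(vi * vi for vi in v))
    return math.acos(dot / (norm_u * norm_v))

print(angle([2, 0], [0, -1]))  # 1.5707963..., i.e. pi/2: a right angle
```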

One often writes \(\|u\|\) for \(\sqrt{u\cdot u}\).
When \(n = 2\) or \(n = 3\), one can see from the Pythagorean Theorem
that \(\|u\|\) gives the length of the arrow representing \(u\).
For general \(n\), \(\|u\|\) is called the **norm** of \(u\)
and is the quantity \(\sqrt{u_1^2 + u_2^2 + \cdots + u_n^2}\).

A vector having norm \(1\) is called a **unit vector**.
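A common operation is scaling a nonzero vector to a unit vector by dividing each component by the norm; a sketch (helper names `norm` and `normalize` are ours):

```python
import math

def norm(u):
    """The norm sqrt(u1^2 + ... + un^2) of u."""
    return math.sqrt(sum(ui * ui for ui in u))

def normalize(u):
    """Scale a nonzero vector u to a unit vector pointing the same way."""
    n = norm(u)
    return [ui / n for ui in u]

print(norm([3, 4]))             # 5.0
print(normalize([3, 4]))        # [0.6, 0.8], a unit vector
```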

**Remark**:
One can in fact show that for all integers \(n \geq 1\) and
\(u,v\in\mathbb{R}^n\),
\[|u\cdot v| \leq \|u\|\|v\|.\]
This inequality is known as the Cauchy-Schwarz Inequality.
Hence,
\(\displaystyle\frac{|u\cdot v|}{\|u\|\|v\|}\leq 1\).
With this, one can
*define* the cosine of the angle between two
vectors \(u,v\in\mathbb{R}^n\)
as \(\displaystyle\frac{u\cdot v}{\|u\|\|v\|}\) even
for \(n \geq 4\). Of course, one cannot visualize an angle beyond three
dimensions; the terminology is simply carried over to higher dimensions
by analogy.
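The inequality can be spot-checked numerically on random vectors; this is evidence, not a proof, and the dimension and sample count here are arbitrary choices:

```python
import math
import random

# Spot-check the Cauchy-Schwarz inequality |u . v| <= ||u|| ||v||
# on random vectors in R^5.
random.seed(0)
for _ in range(1000):
    u = [random.uniform(-10, 10) for _ in range(5)]
    v = [random.uniform(-10, 10) for _ in range(5)]
    dot = sum(ui * vi for ui, vi in zip(u, v))
    norm_u = math.sqrt(sum(x * x for x in u))
    norm_v = math.sqrt(sum(x * x for x in v))
    # Small tolerance guards against floating-point rounding.
    assert abs(dot) <= norm_u * norm_v + 1e-9
print("Cauchy-Schwarz held on all samples")
```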

Let \(u = \begin{bmatrix} 2 \\ 0\end{bmatrix}\) and \(v = \begin{bmatrix} 0 \\ -1 \end{bmatrix}\). Clearly, the angle between \(u\) and \(v\), call it \(\theta\), is \(\frac{\pi}{2}\) radians. (Recall that we take the smaller angle.) Thus \(\cos \theta = 0.\) One can also obtain this by applying the formula above: \[\cos\theta = \frac{u\cdot v}{\|u\|\|v\|} = \frac{ (2)(0) + (0)(-1) } {\|u\| \|v\|} = 0.\]

Let \(u = \begin{bmatrix} 3 \\ 3\end{bmatrix}\) and \(v = \begin{bmatrix} 0 \\ 2 \end{bmatrix}\). Clearly, the angle between \(u\) and \(v\), call it \(\theta\), is \(\frac{\pi}{4}\) radians. Thus \(\cos \theta = 1/\sqrt{2}.\) One can also obtain this by applying the formula above: \[\cos\theta = \frac{u\cdot v}{\|u\| \|v\|} = \frac{ (3)(0) + (3)(2) } {\sqrt{3^2+3^2}\sqrt{0^2+2^2}} = \frac{ 6 } {\sqrt{18}\sqrt{4}} = \frac{ 6 } {6\sqrt{2}} = \frac{ 1 } {\sqrt{2}}.\]

If \(u,v\in\mathbb{R}^n\), then \(u^\mathsf{T} v\) technically is a \(1\times 1\) matrix whose entry is \(u\cdot v\). However, one often for convenience regards a \(1\times 1\) matrix as the entry that it contains. In other words, one often writes \(u^\mathsf{T} v\) and \(u \cdot v\) interchangeably. The context usually makes it clear if \(u^\mathsf{T}v\) refers to a number or a matrix.
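The distinction between the \(1\times 1\) matrix \(u^\mathsf{T}v\) and the number \(u\cdot v\) is visible in code. A sketch assuming NumPy is available:

```python
import numpy as np

u = np.array([[1], [2], [3]])   # a column vector, i.e. a 3x1 matrix
v = np.array([[4], [5], [6]])

product = u.T @ v               # u^T v: a 1x1 matrix
print(product)                  # [[32]]
print(product.item())           # 32, the entry it contains, i.e. u . v
```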

**Exercise**: For each of the following pairs of vectors, compute \(u\cdot v\).

1. \(u = \begin{bmatrix} 2 \\ 3\end{bmatrix}\) and \(v = \begin{bmatrix} 4 \\ -1\end{bmatrix}\).

2. \(u = \begin{bmatrix} 1 \\ 1 \\ -1 \end{bmatrix}\) and \(v = \begin{bmatrix} 1 \\ -1 \\ 1 \end{bmatrix}\).

3. \(u = \begin{bmatrix} 1 \\ 0 \\ 0 \\ 1\end{bmatrix}\) and \(v = \begin{bmatrix} 0 \\ 1 \\ 1 \\ 0\end{bmatrix}\).