# Tuesday: November 5

## Hilbert Spaces

> See notes on the webpage.

**Definition**: An *inner product* on a vector space $V$ satisfies

- $\inner{ax + by}{z} = a \inner{x}{z} + b\inner{y}{z}$, i.e., for all fixed $z\in V$, the map $x \mapsto \inner{x}{z}$ is a *linear functional*.
- $\inner{x}{y} = \overline{\inner{y}{x}}$
- $\inner{x}{x} \in [0, \infty)$, with $\inner{x}{x} = 0$ iff $x = 0$.

This induces a *norm*, $\norm{x} = \inner{x}{x}^{1/2}$.

**Proposition 1:** The map $x \mapsto \norm{x}$ does in fact define a norm.

> The key to establishing this is the triangle inequality, since many of the other necessary properties fall out easily.

We'll need the **Cauchy-Schwarz inequality**, i.e.
$$
\abs{\inner{x}{y}} \leq \norm{x} \norm{y}
,$$
with equality iff $x = \lambda y$ for some $\lambda \in \CC$.

> Note that this relates an inner product to a norm, as opposed to other inequalities which relate norms to other norms.

**A useful computation**:
\begin{align*}
\norm{x + y}^2 &= \inner{x+y}{x+y} \\
&= \norm{x}^2 + 2\mathrm{Re}\inner{x}{y} + \norm{y}^2 \\
&\leq \norm{x}^2 + 2\abs{\inner{x}{y}} + \norm{y}^2 \\
&\leq \norm{x}^2 + 2\norm{x}\norm{y} + \norm{y}^2 \quad\quad\text{by Schwarz} \\
&= (\norm{x} + \norm{y})^2
,\end{align*}
so taking square roots gives the triangle inequality $\norm{x+y} \leq \norm{x} + \norm{y}$.

**Definition:** An inner product space that is *complete* with respect to the norm $\norm{\wait}$ induced from its inner product is a *Hilbert space*.

> Recall that a Banach space is a complete *normed* space.

*Examples:*

- $$\CC^n \text{ with } \inner{x}{y} = \sum x_j \overline{y_j} .$$
- $$\ell^2(\NN) \text{ with } \inner{x}{y} = \sum_{j=1}^\infty x_j \overline{y_j} .$$ Note that this is finite by **AM-GM**, since $x, y \in \ell^2$ gives $$\sum \abs{x_j \overline{y_j}} \leq \frac 1 2 \left(\sum \abs{x_j}^2 + \sum \abs{y_j}^2\right) < \infty .$$
- $$L^2(\RR^n) \text{ with } \inner{f}{g} = \int f \overline{g} .$$ This is also finite because $$\int \abs{f\overline g} \leq \frac 1 2 \left(\int \abs{f}^2 + \int \abs{g}^2\right) < \infty .$$

**Proof of Schwarz Inequality:**

If $x = \lambda y$ for some $\lambda \in \CC$, we have equality since
$$
\abs{\inner{x}{y}} = \abs{\inner{\lambda y}{y}} = \abs{\lambda} \norm{y}^2 = \norm{\lambda y}\norm{y} = \norm{x}\norm{y}
.$$

So we can assume $x - \lambda y \neq 0$ for *any* $\lambda \in \CC$, and thus
$$
\inner{x-\lambda y}{x-\lambda y} > 0
.$$

We may also assume $\inner{x}{y} \neq 0$, since otherwise the inequality is immediate. Expanding, we have
$$
0 < \inner{x-\lambda y}{x-\lambda y} = \norm{x}^2 - 2 \mathrm{Re}\left( \overline{\lambda} \inner{x}{y}\right) + \abs{\lambda}^2 \norm{y}^2
.$$

Now let $\lambda = tu$ where $t\in \RR$ and $u =\inner{x}{y} / \abs{\inner{x}{y}}$, so that $\overline{u}\inner{x}{y} = \abs{\inner{x}{y}}$. Then we get
$$
0 < \norm{x}^2 - 2t\abs{\inner{x}{y}} + t^2 \norm{y}^2
.$$

But this is quadratic in $t$ and doesn't have a real root, so its discriminant must be negative. Thus
$$
4\abs{\inner{x}{y}}^2 - 4\norm{x}^2 \norm{y}^2 < 0
,$$
which yields Cauchy-Schwarz. $\qed$

## Continuity of Norm and Inner Product

**Application of the Schwarz Inequality:** If $x_n \to x$ in $V$, i.e. $\norm{x_n - x} \to 0$, and similarly $y_n \to y$, then $\inner{x_n}{y_n} \to \inner{x}{y}$ in $\CC$.

*Proof:* We have
\begin{align*}
\abs{ \inner{x_n}{y_n} - \inner{x}{y} }
&= \abs{ \inner{x_n - x}{y_n} + \inner{x}{y_n - y} } \\
&\leq \abs{\inner{x_n - x}{y_n}} + \abs{\inner{x}{y_n - y}} \\
&\leq \norm{x_n - x}\norm{y_n} + \norm{x} \norm{y_n - y}\quad\quad\text{by Schwarz} \\
& \to 0
,\end{align*}
since the convergent sequence $\norm{y_n}$ is bounded. $\qed$

*Exercise*: Show that $\norm{y_n - y} \to 0$ implies $\norm{y_n} \to \norm{y}$.

**Proposition (Parallelogram Law):** Let $H$ be an inner product space; then
$$
\norm{x+y}^2 + \norm{x-y}^2 = 2\left(\norm{x}^2 + \norm{y}^2\right)
.$$

*Exercise*: Prove this using a parallelogram diagram.

*Proof:* Use the fact that
$$
\norm{x \pm y}^2 = \norm{x}^2 \pm 2\mathrm{Re}\inner{x}{y} + \norm{y}^2
,$$
then add the two equations; the cross terms cancel.
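
> The following worked example is not from the lecture; it is a quick sanity check of the Schwarz inequality and the parallelogram law.

In $\CC^2$ with the standard inner product, take $x = (1, i)$ and $y = (1, 1)$. Then
\begin{align*}
\inner{x}{y} &= 1\cdot \overline{1} + i \cdot \overline{1} = 1 + i
\implies \abs{\inner{x}{y}} = \sqrt 2 \leq \norm{x}\norm{y} = \sqrt 2 \cdot \sqrt 2 = 2, \\
\norm{x+y}^2 + \norm{x-y}^2 &= \norm{(2,\, 1+i)}^2 + \norm{(0,\, i - 1)}^2 = 6 + 2 = 8 = 2\left(\norm{x}^2 + \norm{y}^2\right)
.\end{align*}
The Schwarz inequality is strict here, consistent with the fact that $x$ is not a scalar multiple of $y$.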
**Proposition (Pythagorean Theorem):**
$$
\inner{x}{y} = 0 \implies \norm{x+y}^2 = \norm{x}^2 + \norm{y}^2
.$$
In this situation, we say $x,y$ are *orthogonal*.

**Corollary:** If $\theset{x_i}$ are pairwise orthogonal, then
$$
\norm{\sum x_i}^2 = \sum \norm{x_i}^2
.$$

## Orthonormal Sets

**Definition:** A countable collection $\theset{u_n}$ is *orthonormal* iff

1. $\inner{u_j}{u_k} = 0$ for $j\neq k$, and
2. $\inner{u_j}{u_j} = \norm{u_j}^2 = 1$ for all $j$.

> Note: we only consider countable collections; a *separable* Hilbert space always has such a basis.

**Definition:** We say $\theset{u_n}$ is an orthonormal *basis* for $H$ if $\mathrm{span} \theset{u_n}$ (i.e. *finite* linear combinations of the $u_n$) is dense in $H$.

## Best Approximation and Bessel's Inequality

**Theorem:** Let $\theset{u_n}$ be a countable orthonormal basis of $H$. Then for any $x\in H$, the *best approximation* to $x$ by a sum $\sum_{n=1}^N a_n u_n$ is obtained by taking $a_n = \inner{x}{u_n}$.

> Note: these $a_n$ will be Fourier coefficients later!

*Proof:*
\begin{align*}
\norm{x - \sum a_n u_n}^2
&= \norm{x}^2 - 2\mathrm{Re} \sum \overline{a_n}\inner{x}{u_n} + \sum \abs{a_n}^2 \\
&= \norm{x}^2 - \sum \abs{\inner{x}{u_n}}^2 + \sum\left( \abs{\inner{x}{u_n}}^2 - 2\mathrm{Re}\, \overline{a_n}\inner{x}{u_n} + \abs{a_n}^2 \right) \\
&= \norm{x}^2 - \sum \abs{\inner{x}{u_n}}^2 + \sum \abs{\inner{x}{u_n} - a_n }^2 \\
&\geq \norm{x}^2 - \sum \abs{\inner{x}{u_n}}^2
,\end{align*}
where all sums range over $n \leq N$, and equality is attained iff $a_n = \inner{x}{u_n}$ for every $n$. So this is the best approximation.

> Note: Equalities are somehow easier to show -- they necessarily involve direct computations.

But then
$$
0 \leq \norm{x - \sum_{n \leq N} \inner{x}{u_n} u_n}^2 = \norm{x}^2 - \sum_{n \leq N} \abs{\inner{x}{u_n}}^2
,$$
so $\sum_{n \leq N} \abs{\inner{x}{u_n}}^2 \leq \norm{x}^2$ holds for every $N$, and thus for the infinite sum, which is **Bessel's inequality**.

> If this is an equality, then this is exactly **Parseval's theorem**.

## Riesz-Fischer

**Theorem (Riesz-Fischer):**

1. The map
   $$
   x \mapsvia{ \hat{} } \inner{x}{u_n} \definedas \hat{x}(n)
   $$
   maps $H$ onto $\ell^2(\NN)$, i.e. it is surjective.
2. If $\theset{u_n}_{n=1}^\infty$ is orthonormal in $H$ and $\theset{a_n}_{n=1}^\infty \in \ell^2(\NN)$, then there exists an $x\in H$ such that $\inner{x}{u_n} = a_n$ for all $n\in \NN$.
3. Moreover, $x$ can be chosen such that $\norm{x} = \sqrt{\sum \abs{a_n}^2}$.

*Remarks:* This map need not be injective, so such an $x$ may not be unique. The $a_n$ are referred to as the *Fourier coefficients* of $x$. If
$$
a_n = 0 \text{ for all } n \implies x=0
,$$
then the set $\theset{u_n}$ is said to be **complete**. This turns out to be equivalent to $\theset{u_n}$ being a *basis*, which is in turn equivalent to the convergence of Fourier series.
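
> The following example is not from the lecture; it illustrates the theorem in the simplest setting.

Take $H = \ell^2(\NN)$ with the standard orthonormal basis $\theset{e_n}$, where $e_n$ has a $1$ in the $n$-th slot and zeros elsewhere. Then $\hat{x}(n) = \inner{x}{e_n} = x_n$, and Bessel's inequality becomes the equality $\sum \abs{x_n}^2 = \norm{x}^2$, i.e. Parseval's theorem. For the square-summable sequence $a_n = 1/n$, the element promised by Riesz-Fischer is $x = \left(1, \tfrac 1 2, \tfrac 1 3, \dots\right)$ itself, with
$$
\norm{x} = \sqrt{\sum_{n=1}^\infty \frac{1}{n^2}} = \frac{\pi}{\sqrt 6}
.$$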