Dr. Matthieu R Bloch

Wednesday September 08, 2021

**Assignment 2**
- Posted Wednesday September 08, 2021
- Due Tuesday September 14, 2021

**Assignment 3**
- Posted Tuesday September 14, 2021
- Due Monday September 20, 2021

**Midterm 1**
- Will make a decision regarding modality by next Wednesday.
- Moved to a take-home exam if Surveillance Incidence Positivity >1
- https://health.gatech.edu/coronavirus/health-alerts

**Last time:**
- Normed vector spaces, inner product vector spaces (pre-Hilbert)

**Key takeaway:** we can't do geometry without an inner product

**Key concepts:** norm, inner product

**Questions?**

**Today:** Hilbert spaces
- More on vector spaces: Cauchy-Schwarz, orthogonality, angles
- Orthogonal projections

**Monday September 13, 2021**: orthobases (and non-orthogonal ones)

In an inner product space, an inner product induces a norm \(\norm{x} \eqdef \sqrt{\dotp{x}{x}}\)

A norm \(\norm{\cdot}\) on \(\calV\) is induced by an inner product iff the *parallelogram law* holds: \(\forall x,y\in\calV\), \(\norm{x}^2+\norm{y}^2 = \frac{1}{2}\left(\norm{x+y}^2+\norm{x-y}^2\right)\). If this is the case, the inner product is given by the *polarization identity* \[\dotp{x}{y}=\frac{1}{2}\left(\norm{x}^2+\norm{y}^2-\norm{x-y}^2\right)\]

**Induced** norms have some nice additional properties. An induced norm satisfies the triangle inequality: \(\forall x,y\in\calV\), \(\norm{x+y}\leq \norm{x}+\norm{y}\)
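As a quick sanity check (a numerical sketch, assuming \(\bbR^n\) with the standard dot product and NumPy), both identities can be verified on random vectors:

```python
import numpy as np

rng = np.random.default_rng(0)
x, y = rng.standard_normal(5), rng.standard_normal(5)

norm = np.linalg.norm  # Euclidean norm, induced by the dot product

# Parallelogram law: ||x||^2 + ||y||^2 = (||x+y||^2 + ||x-y||^2) / 2
lhs = norm(x) ** 2 + norm(y) ** 2
rhs = 0.5 * (norm(x + y) ** 2 + norm(x - y) ** 2)
assert np.isclose(lhs, rhs)

# Polarization identity: recover the inner product from the norm alone
recovered = 0.5 * (norm(x) ** 2 + norm(y) ** 2 - norm(x - y) ** 2)
assert np.isclose(recovered, x @ y)
```

The same check would fail for a norm not induced by an inner product, e.g. the \(\ell_1\) norm.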

An inner product satisfies the *Cauchy-Schwarz inequality*: \(\forall x,y\in\calV\), \(\dotp{x}{y}^2\leq\dotp{x}{x}\dotp{y}{y}\)

- In the following \(\calV\) is an inner product space with induced norm \(\norm{\cdot}\)
- The angle between two non-zero vectors \(x,y\in\calV\) is \[ \cos\theta \eqdef \frac{\dotp{x}{y}}{\norm{x}\norm{y}} \]
- Two vectors \(x,y\in\calV\) are *orthogonal* if \(\dotp{x}{y}=0\). We write \(x\perp y\) for simplicity.
- A vector \(x\in\calV\) is orthogonal to a set \(\calS\subset\calV\) if \(\forall s\in\calS\) \(\dotp{x}{s}=0\). We write \(x\perp \calS\) for simplicity.
- If \(x\perp y\) then \(\norm{x+y}^2=\norm{x}^2+\norm{y}^2\) (Pythagorean theorem)
- Inner product spaces have almost all the properties of \(\bbR^n\)!
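To illustrate (a sketch in \(\bbR^n\) with the standard dot product; the `angle` helper is ours, not a library function), Cauchy-Schwarz guarantees the cosine lies in \([-1,1]\), so the angle is well defined:

```python
import numpy as np

def angle(x, y):
    """Angle between two nonzero vectors via cos(theta) = <x,y> / (||x|| ||y||)."""
    c = (x @ y) / (np.linalg.norm(x) * np.linalg.norm(y))
    return np.arccos(np.clip(c, -1.0, 1.0))  # clip guards against rounding error

x = np.array([1.0, 0.0])
y = np.array([0.0, 2.0])
print(np.degrees(angle(x, y)))  # orthogonal vectors -> 90.0

# Pythagorean theorem for orthogonal vectors: ||x+y||^2 = ||x||^2 + ||y||^2
assert np.isclose(np.linalg.norm(x + y) ** 2,
                  np.linalg.norm(x) ** 2 + np.linalg.norm(y) ** 2)
```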

In infinite dimensions, things are a little bit tricky. What does the following mean? \[ x(t) = \sum_{n=1}^\infty \alpha_n\psi_n(t) \]

We need to define a notion of *convergence*, e.g., \[ \lim_{N\to\infty}\norm{x(t)-\sum_{n=1}^N \alpha_n\psi_n(t)}=0 \]

Problems can still arise if "points are missing"; we avoid this by introducing the notion of *completeness*.

An inner product space \(\calV\) is complete if every Cauchy sequence converges, i.e., for every \(\set{x_i}_{i\geq1}\) in \(\calV\) \[ \lim_{\min(m,n)\to\infty}\norm{x_m-x_n}=0\Rightarrow \lim_{n\to\infty}x_n = x^*\in\calV. \]
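The convergence of partial sums can be seen numerically; here is a sketch assuming the orthonormal sine basis \(\psi_n(t)=\sqrt{2/\pi}\sin(nt)\) on \([0,\pi]\), with the \(L^2\) inner product approximated by a Riemann sum on a grid:

```python
import numpy as np

# Approximate x(t) = t on [0, pi] by partial sums of the orthonormal
# sine basis psi_n(t) = sqrt(2/pi) sin(n t); coefficients alpha_n = <x, psi_n>.
t = np.linspace(0, np.pi, 10_000)
dt = t[1] - t[0]
x = t

def residual_norm(N):
    """L2 norm of x minus the N-term partial sum, computed on the grid."""
    approx = np.zeros_like(t)
    for n in range(1, N + 1):
        psi = np.sqrt(2 / np.pi) * np.sin(n * t)
        alpha = np.sum(x * psi) * dt       # discretized inner product <x, psi_n>
        approx += alpha * psi
    return np.sqrt(np.sum((x - approx) ** 2) * dt)

errs = [residual_norm(N) for N in (1, 5, 25, 125)]
assert all(a > b for a, b in zip(errs, errs[1:]))  # error shrinks as N grows
```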
We won’t worry too much about *proving* that spaces are complete.

A complete normed vector space is a **Banach space**; a complete inner product space is a **Hilbert space**.

Let \(\calH\) be a Hilbert space with inner product \(\dotp{\cdot}{\cdot}\) and induced norm \(\norm{\cdot}\); let \(\calT\) be a subspace of \(\calH\)

For \(x\in\calH\), what is the closest point \(\hat{x}\in\calT\)? How do we solve

\[ \min_{y\in\calT}\norm{x-y} \]

This problem has a unique solution given by the orthogonality principle

Let \(\calX\) be a *pre-Hilbert* space, \(\calT\) be a subspace of \(\calX\), and \(x\in\calX\).

- If there exists a vector \(m^*\in\calT\) such that \(\forall m\in\calT\) \(\norm{x-m^*}\leq \norm{x-m}\), then \(m^*\) is unique.
- A necessary and sufficient condition for \(m^*\in\calT\) to be a unique minimizer is that the error \(x-m^*\) be orthogonal to \(\calT\).

This doesn’t say that \(m^*\) exists!

Let \(\calH\) be a *Hilbert* space, \(\calT\) be a *closed* subspace of \(\calH\), and \(x\in\calH\).

- There exists a unique vector \(m^*\in\calT\) such that \(\forall m\in\calT\) \(\norm{x-m^*}\leq \norm{x-m}\).
- A necessary and sufficient condition for \(m^*\in\calT\) to be a unique minimizer is that the error \(x-m^*\) be orthogonal to \(\calT\).

The orthogonality principle gives us a procedure for computing the closest point

Let \(\calH\) be a *Hilbert* space, \(\calT\) be a subspace of \(\calH\) with dimension \(n\), and \(x\in\calH\). Let \(\set{e_i}_{i=1}^n\) be a basis for \(\calT\). Then the projection \(\hat{x}\) of \(x\) onto \(\calT\) is \[ \hat{x} = \sum_{i=1}^n\alpha_i e_i \] where \(\bfalpha\eqdef\mat{c}{\alpha_1&\cdots&\alpha_n}^\intercal\) is the solution of \(\bfG\bfalpha=\bfb\), with \(\bfG\) the Gram matrix of the basis (entries \(G_{ij}=\dotp{e_j}{e_i}\)) and \(\bfb\) the vector with entries \(b_i=\dotp{x}{e_i}\).
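This procedure can be sketched concretely; here is a minimal example in \(\bbR^3\) with a non-orthogonal two-dimensional subspace (the specific basis vectors are chosen for illustration only):

```python
import numpy as np

# Project x onto span(e_1, e_2) in R^3 by solving the Gram system
# G alpha = b, with G_ij = <e_j, e_i> and b_i = <x, e_i>.
E = np.array([[1.0, 1.0, 0.0],   # e_1 (rows are basis vectors; not orthogonal)
              [0.0, 1.0, 1.0]])  # e_2
x = np.array([1.0, 2.0, 3.0])

G = E @ E.T          # Gram matrix of the basis
b = E @ x            # b_i = <x, e_i>
alpha = np.linalg.solve(G, b)
x_hat = alpha @ E    # projection of x onto the subspace

# Orthogonality principle: the error x - x_hat is orthogonal to every e_i
assert np.allclose(E @ (x - x_hat), 0.0)
```

When the basis is orthonormal, \(\bfG\) is the identity and the solve reduces to \(\alpha_i=\dotp{x}{e_i}\), which is the orthobasis case coming up on Monday.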