Deterministic Least Squares

Dr. Matthieu R Bloch

Thursday August 25, 2022

Today in ECE 6555

  • Don’t forget
    1. Read the syllabus (for real!) and ask for clarifications as needed
    2. Register on Piazza
    3. Check Gradescope access
    4. Check out the self assessment (ungraded, see Canvas later today)
  • Announcements
    1. Mathematics of ECE workshops (more on this soon)
    2. Problem set 1 posted (reinforcing linear algebra, formal thinking, and extending some of what we’ve seen in class)
  • Today’s plan
    • Geometry in (finite) dimensional spaces
    • Normal equations
    • Deterministic least squares
    • Why? Geometric intuition is incredibly powerful and will carry over to more complex settings
  • Questions?

Inner product and norm

  • An inner product space over \(\bbR\) is a vector space \(\calV\) equipped with a positive definite symmetric bilinear form \(\dotp{\cdot}{\cdot}:\calV\times\calV\to\bbR\) called an inner product

  • An inner product space is also called a pre-Hilbert space

  • An inner product satisfies the Cauchy–Schwarz inequality: \(\forall x,y\in\calV\) \(\dotp{x}{y}^2\leq\dotp{x}{x}\dotp{y}{y}\)

  • A norm on a vector space \(\calV\) over \(\bbR\) is a function \(\norm{\cdot}:\calV\to\bbR\) that satisfies:
    • Positive definiteness: \(\forall x\in\calV\) \(\norm{x}\geq 0\) with equality iff \(x=0\)
    • Homogeneity: \(\forall x\in\calV\) \(\forall\alpha\in\bbR\) \(\norm{\alpha x}=\abs{\alpha}\norm{x}\)
    • Subadditivity: \(\forall x,y\in\calV\) \(\norm{x+y}\leq \norm{x}+\norm{y}\)
  • \[\bfx\in\bbR^d\qquad\norm[0]{\bfx}\eqdef\card{\set{i:x_i\neq 0}}\quad\norm[1]{\bfx}\eqdef\sum_{i=1}^d\abs{x_i}\quad \norm[2]{\bfx}\eqdef\sqrt{\sum_{i=1}^d x_i^2}\]
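  The three norms above can be sketched numerically; a minimal example (not from the lecture, using NumPy and a vector we chose for illustration):

```python
# Computing the l0 "norm", l1 norm, and l2 norm defined above for a
# vector in R^d. (Example vector is ours, not from the lecture.)
import numpy as np

x = np.array([3.0, 0.0, -4.0, 0.0])

norm0 = np.count_nonzero(x)   # l0: number of nonzero entries
norm1 = np.sum(np.abs(x))     # l1: sum of absolute values
norm2 = np.sqrt(np.sum(x**2)) # l2: Euclidean length

print(norm0, norm1, norm2)  # 2 7.0 5.0
```

  Note that \(\norm[0]{\cdot}\) is a norm only by abuse of terminology: it fails homogeneity, since \(\norm[0]{\alpha\bfx}=\norm[0]{\bfx}\) for any \(\alpha\neq 0\).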

Induced norm

  • In an inner product space, an inner product induces a norm \(\norm{x} \eqdef \sqrt{\dotp{x}{x}}\)
  • A norm \(\norm{\cdot}\) is induced by an inner product on \(\calV\) iff \(\forall x,y\in\calV\) \[\norm{x}^2+\norm{y}^2 = \frac{1}{2}\left(\norm{x+y}^2+\norm{x-y}^2\right)\]

    If this is the case, the inner product is given by the polarization identity \[\dotp{x}{y}=\frac{1}{2}\left(\norm{x}^2+\norm{y}^2-\norm{x-y}^2\right)\]
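  Both identities are easy to sanity-check numerically for the Euclidean inner product; a small sketch (random vectors, NumPy; all names are ours):

```python
# Checking the parallelogram law and the polarization identity for the
# Euclidean inner product on R^3, with random test vectors.
import numpy as np

rng = np.random.default_rng(0)
x, y = rng.standard_normal(3), rng.standard_normal(3)

n = np.linalg.norm  # Euclidean norm, induced by the dot product

# Parallelogram law: ||x||^2 + ||y||^2 = (||x+y||^2 + ||x-y||^2) / 2
lhs = n(x)**2 + n(y)**2
rhs = 0.5 * (n(x + y)**2 + n(x - y)**2)
assert np.isclose(lhs, rhs)

# Polarization identity: <x,y> = (||x||^2 + ||y||^2 - ||x-y||^2) / 2
assert np.isclose(x @ y, 0.5 * (n(x)**2 + n(y)**2 - n(x - y)**2))
```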

Orthogonality

  • Two vectors \(x,y\in\calV\) are orthogonal if \(\dotp{x}{y}=0\). We write \(x\perp y\) for simplicity.
  • A vector \(x\in\calV\) is orthogonal to a set \(\calS\subset\calV\) if \(\forall s\in\calS\) \(\dotp{x}{s}=0\). We write \(x\perp \calS\) for simplicity.
  • Two vector spaces \(\calV\) and \(\calW\) are orthogonal if \(\forall v\in\calV\) and \(\forall w\in\calW\) we have \(v\perp w\).
  • Pythagorean theorem: if \(x\perp y\) then \(\norm{x+y}^2=\norm{x}^2+\norm{y}^2\)

  • The orthogonal complement of a vector space \(\calW\subset\calV\subset\bbR^n\) is \[ \calW^\perp\eqdef \set{x\in\calV:\dotp{x}{y}=0\ \forall y\in\calW} \]

  • Homework problem: check that \(\calW^\perp\) is a vector space and that \((\calW^\perp)^\perp=\calW\)

  • \[ \text{Ker}(\matH) = \text{Im}(\matH^\intercal)^\perp\quad\text{Im}(\matH) = \text{Ker}(\matH^\intercal)^\perp \]
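  The first identity can be verified numerically: every null-space vector of \(\matH\) is orthogonal to every row of \(\matH\). A sketch under our own choice of random matrix, using the SVD to get a null-space basis (NumPy only):

```python
# Verifying Ker(H) ⟂ Im(H^T) for a random wide matrix H in R^{3x5}.
import numpy as np

rng = np.random.default_rng(1)
H = rng.standard_normal((3, 5))  # H maps R^5 -> R^3, so Ker(H) ⊂ R^5

# Right singular vectors associated with zero singular values span Ker(H).
U, s, Vt = np.linalg.svd(H)
r = np.linalg.matrix_rank(H)
null_basis = Vt[r:]  # each row is a basis vector of Ker(H)

# Each null-space vector is orthogonal to every row of H,
# i.e. to every column of H^T, hence to all of Im(H^T).
assert np.allclose(H @ null_basis.T, 0)
```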

Direct sums

  • Consider vector subspaces \(\calU,\calV,\calW\) of \(\bbR^n\). Then \(\calW=\calU\oplus\calV\) iff for every \(w\in\calW\) there exists a unique pair \((u,v)\in\calU\times\calV\) such that \(w=u+v\)

  • \[ \text{Ker}(\matH)\oplus\text{Im}(\matH^\intercal) = \bbR^n\qquad \text{Ker}(\matH^\intercal)\oplus\text{Im}(\matH) = \bbR^m \]

  • Homework problem: prove that \(\calW\) and \(\calW^\perp\) are in direct sum
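  The decomposition \(\bbR^n = \text{Ker}(\matH)\oplus\text{Im}(\matH^\intercal)\) can be computed explicitly by projecting onto the row space; a sketch using the pseudoinverse (random data and variable names are ours):

```python
# Decomposing w in R^5 as w = u + v with u in Im(H^T) and v in Ker(H).
import numpy as np

rng = np.random.default_rng(2)
H = rng.standard_normal((3, 5))
w = rng.standard_normal(5)

P = np.linalg.pinv(H) @ H  # orthogonal projector onto Im(H^T)
u = P @ w                  # row-space component
v = w - u                  # remainder lies in Ker(H)

assert np.allclose(u + v, w)  # the pair (u, v) reconstructs w
assert np.allclose(H @ v, 0)  # v is in Ker(H)
assert np.allclose(u @ v, 0)  # the two components are orthogonal
```

  The orthogonality of the two components is exactly the homework claim that \(\calW\) and \(\calW^\perp\) are in direct sum, specialized to \(\calW=\text{Im}(\matH^\intercal)\).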

  • That’s about it for our review of linear algebra for now!