Dr. Matthieu R Bloch
Wednesday, October 6, 2021
Assignment 4 assigned Tuesday, October 5, 2021
Includes a (small) programming component
Due October 14, 2021 (soft deadline, hard deadline on October 16)
Last time: Least-squares regression
Today
Solving linear least-squares regression
Extension to infinite dimensions
Reading: Romberg, lecture notes 8
Facts: for any matrix \(\bfA\in\bbR^{m\times n}\)
\(\ker{\bfA^\intercal\bfA}=\ker{\bfA}\)
\(\text{col}(\bfA^\intercal\bfA)=\text{row}(\bfA)\)
\(\text{row}(\bfA)\) and \(\ker{\bfA}\) are orthogonal complements
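These facts are easy to sanity-check numerically. Below is a minimal NumPy sketch (not part of the notes; the matrix sizes, rank, and random seed are arbitrary choices) that builds a rank-deficient \(\bfA\) and verifies the three statements via the SVD.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a rank-deficient A in R^{m x n} (here 5 x 4 with rank 2)
# so that ker(A) is nontrivial.
m, n, r = 5, 4, 2
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))

# Right singular vectors with (numerically) zero singular values span ker(A);
# the remaining ones span row(A).
_, s, Vt = np.linalg.svd(A)
ker_A = Vt[r:].T    # basis of ker(A)
row_A = Vt[:r].T    # basis of row(A)

# ker(A^T A) = ker(A): the kernel basis is killed by both A and A^T A
print(np.allclose(A @ ker_A, 0))          # True
print(np.allclose(A.T @ A @ ker_A, 0))    # True

# col(A^T A) = row(A): col(A^T A) is always contained in row(A),
# so equal ranks force the two subspaces to coincide
print(np.linalg.matrix_rank(A.T @ A) == np.linalg.matrix_rank(A))  # True

# row(A) and ker(A) are orthogonal complements of each other in R^n
print(np.allclose(row_A.T @ ker_A, 0))    # True
```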
These facts let us say a lot more about the solutions of the normal equations
In machine learning, the normal equations often have infinitely many solutions (e.g., when there are more parameters than data points)
One reasonable way to choose a solution among infinitely many is the minimum energy principle \[ \min_{\bftheta\in\bbR^d}\norm[2]{\bftheta}^2\text{ such that } \bfX^\intercal\bfX\bftheta = \bfX^\intercal\bfy \]
For now, assume that \(\bfX\) has full row rank, so that \(\bfy\in\text{col}(\bfX)\) and the normal equations are equivalent to \(\bfX\bftheta=\bfy\); the problem becomes \[ \min_{\bftheta\in\bbR^d}\norm[2]{\bftheta}^2\text{ such that } \bfX\bftheta = \bfy \]
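Under this assumption the minimizer has a closed form: any component of \(\bftheta\) in \(\ker{\bfX}\) leaves the constraint unchanged and only increases the norm, so the optimum lies in \(\text{row}(\bfX)\); writing \(\bftheta^\star=\bfX^\intercal\bfz\) and imposing \(\bfX\bftheta^\star=\bfy\) gives \(\bftheta^\star = \bfX^\intercal(\bfX\bfX^\intercal)^{-1}\bfy\), the pseudoinverse solution. The following minimal NumPy sketch (not from the notes; the problem sizes and data are arbitrary) checks that this closed form matches np.linalg.pinv and np.linalg.lstsq, and that it beats any other feasible solution in norm.

```python
import numpy as np

rng = np.random.default_rng(1)

# Underdetermined system: n = 3 equations, d = 6 unknowns,
# X has full row rank, so X @ theta = y has infinitely many solutions.
n, d = 3, 6
X = rng.standard_normal((n, d))
y = rng.standard_normal(n)

# Closed-form minimum-norm solution theta = X^T (X X^T)^{-1} y
theta = X.T @ np.linalg.solve(X @ X.T, y)

# The pseudoinverse and lstsq both return the same minimum-norm solution
print(np.allclose(theta, np.linalg.pinv(X) @ y))                 # True
print(np.allclose(theta, np.linalg.lstsq(X, y, rcond=None)[0]))  # True
print(np.allclose(X @ theta, y))                                 # True

# Adding any kernel component gives another feasible solution with a
# larger norm: theta lies in row(X), which is orthogonal to ker(X).
v = rng.standard_normal(d)
v -= X.T @ np.linalg.solve(X @ X.T, X @ v)   # project v onto ker(X)
print(np.allclose(X @ (theta + v), y))       # True: still feasible
print(np.linalg.norm(theta) < np.linalg.norm(theta + v))         # True
```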