Prof. Matthieu Bloch
Monday, October 7, 2024 (v1.0)
An inner product kernel is a mapping \(k:\bbR^d\times\bbR^d\to\bbR\) for which there exists a Hilbert space \(\calH\) and a mapping \(\Phi:\bbR^d\to\calH\) such that \[\forall \bfu,\bfv\in\bbR^d\quad k(\bfu,\bfv)=\langle\Phi(\bfu),\Phi(\bfv)\rangle_\calH\]
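As a concrete illustration (not in the original notes), the homogeneous quadratic kernel \(k(\bfu,\bfv)=(\bfu^\intercal\bfv)^2\) on \(\bbR^2\) admits the explicit feature map \(\Phi(\bfu)=(u_1^2,\sqrt{2}\,u_1u_2,u_2^2)\) into \(\calH=\bbR^3\); the sketch below checks the identity numerically.

```python
import numpy as np

def k(u, v):
    """Homogeneous quadratic kernel k(u, v) = (u^T v)^2."""
    return np.dot(u, v) ** 2

def phi(u):
    """Explicit feature map into H = R^3 for the kernel above."""
    return np.array([u[0]**2, np.sqrt(2) * u[0] * u[1], u[1]**2])

rng = np.random.default_rng(0)
u, v = rng.standard_normal(2), rng.standard_normal(2)
assert np.isclose(k(u, v), np.dot(phi(u), phi(v)))  # k(u,v) = <Phi(u),Phi(v)>
```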
A function \(k:\bbR^d\times\bbR^d\to\bbR\) is a positive semidefinite kernel if it is symmetric, i.e., \(k(\bfu,\bfv)=k(\bfv,\bfu)\), and if for every \(n\in\bbN\), all points \(\bfu_1,\cdots,\bfu_n\in\bbR^d\), and all \(a_1,\cdots,a_n\in\bbR\), \[\sum_{i=1}^n\sum_{j=1}^n a_ia_jk(\bfu_i,\bfu_j)\geq 0,\] i.e., every Gram matrix \(\bfK\) with entries \(K_{ij}=k(\bfu_i,\bfu_j)\) is positive semidefinite.
A function \(k:\bbR^d\times\bbR^d\to\bbR\) is an inner product kernel if and only if \(k\) is a positive semidefinite kernel.
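One direction of this equivalence is easy to probe numerically: any Gram matrix built from an inner product kernel must be positive semidefinite. A minimal sketch (the Gaussian kernel, point set, and tolerance are illustrative choices):

```python
import numpy as np

def rbf(u, v, gamma=1.0):
    """Gaussian (RBF) kernel, a standard positive semidefinite kernel."""
    return np.exp(-gamma * np.sum((u - v) ** 2))

rng = np.random.default_rng(1)
X = rng.standard_normal((20, 3))                   # 20 points in R^3
K = np.array([[rbf(x, y) for y in X] for x in X])  # Gram matrix
eigvals = np.linalg.eigvalsh(K)                    # symmetric -> eigvalsh
assert eigvals.min() >= -1e-10                     # all eigenvalues >= 0 (up to roundoff)
```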
Regression using linear and quadratic functions in \(\bbR^d\)
Regression using Radial Basis Functions
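A minimal kernel ridge regression sketch with Gaussian radial basis functions; the data, bandwidth \(\gamma\), and ridge parameter \(\lambda\) below are illustrative assumptions, not values from the lecture.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(-3, 3, size=40)
y = np.sin(x) + 0.1 * rng.standard_normal(40)         # noisy 1-D regression data

gamma, lam = 1.0, 1e-3                                # bandwidth and ridge parameter
K = np.exp(-gamma * (x[:, None] - x[None, :]) ** 2)   # RBF Gram matrix
alpha = np.linalg.solve(K + lam * np.eye(len(x)), y)  # fit: (K + lam*I) alpha = y

x_test = np.linspace(-3, 3, 5)
K_test = np.exp(-gamma * (x_test[:, None] - x[None, :]) ** 2)
y_hat = K_test @ alpha                                # predictions at test points
```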
Least squares problems involve the normal equations \(\bfX^\intercal\bfX \bftheta=\bfX^\intercal\bfy\)
This is a symmetric system of equations \(\bfA\bfx=\bfy\) with \(\bfA^\intercal=\bfA\)
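A quick sanity check: solving the normal equations directly agrees with a library least squares routine (problem dimensions are arbitrary).

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.standard_normal((50, 4))                   # tall data matrix
y = rng.standard_normal(50)

A = X.T @ X                                        # symmetric: A^T = A
theta = np.linalg.solve(A, X.T @ y)                # normal equations
theta_ref, *_ = np.linalg.lstsq(X, y, rcond=None)  # reference least squares solution
assert np.allclose(theta, theta_ref)
```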
A real-valued matrix \(\bfA\) is symmetric if \(\bfA^\intercal=\bfA\)
A complex-valued matrix \(\bfA\) is Hermitian if \(\bfA^\dagger=\bfA\) (also written \(\bfA^H=\bfA\))
Given a matrix \(\matA\in\bbC^{n\times n}\), if a nonzero vector \(\vecv\in\bbC^n\) satisfies \(\matA\bfv=\lambda\bfv\) for some \(\lambda\in\bbC\), then \(\lambda\) is an eigenvalue associated with the eigenvector \(\bfv\).
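A small numerical check of the definition with NumPy (the matrix is arbitrary); `np.linalg.eig` returns the eigenvalues and the eigenvectors as columns.

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((4, 4))
lams, V = np.linalg.eig(A)       # eigenvalues and eigenvectors (possibly complex)
for lam, v in zip(lams, V.T):    # columns of V are the eigenvectors
    assert np.allclose(A @ v, lam * v)
```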
Consider the canonical basis \(\set{e_i}_{i=1}^n\) for \(\bbR^n\); every vector is determined by its coefficients \(\set{\alpha_i}_{i=1}^n\), \[ \bfx = \sum_{i=1}^n \alpha_i e_i = \mat{cccc}{\alpha_1&\alpha_2&\cdots&\alpha_n}^\intercal \]
How do we find the representation of \(\bfx\) in another basis \(\set{v_i}_{i=1}^n\)? Write \(e_i=\sum_{j=1}^n\beta_{ij}v_j\)
Regroup the coefficients \[ \bfx = \cdots + \left(\sum_{i=1}^n\beta_{ij}\alpha_i\right) v_j + \cdots \]
In matrix form \[ \bfx_{\text{new}} = \mat{cccc}{\beta_{11}&\beta_{21}&\cdots&\beta_{n1}\\ \beta_{12}&\beta_{22}&\cdots&\beta_{n2}\\\vdots&\vdots&\ddots&\vdots\\\beta_{1n}&\beta_{2n}&\cdots&\beta_{nn}}\bfx \]
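To make the bookkeeping concrete: if \(\bfV\) is the matrix whose columns are \(v_1,\cdots,v_n\), the change-of-basis matrix above is exactly \(\bfV^{-1}\). A short sketch (the basis here is a random invertible one):

```python
import numpy as np

rng = np.random.default_rng(5)
V = rng.standard_normal((3, 3))   # columns are the new basis vectors v_1,...,v_n
x = rng.standard_normal(3)        # coordinates in the canonical basis

x_new = np.linalg.solve(V, x)     # coordinates in the basis {v_i}: x_new = V^{-1} x
assert np.allclose(V @ x_new, x)  # reconstruct x from its new coordinates
```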
If \(\matA,\bfB\in\bbR^{n\times n}\), then \(\bfB\) is similar to \(\bfA\) if there exists an invertible matrix \(\bfP\in\bbR^{n\times n}\) such that \(\bfB=\bfP^{-1}\bfA\bfP\)
\(\matA\in\bbR^{n\times n}\) is diagonalizable if it is similar to a diagonal matrix, i.e., there exists an invertible matrix \(\bfP\in\bbR^{n\times n}\) such that \(\bfD=\bfP^{-1}\bfA\bfP\) with \(\matD\) diagonal
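A sketch of diagonalization in NumPy; for a matrix with distinct eigenvalues, the matrix of eigenvectors serves as \(\bfP\) (the example matrix is mine):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])            # distinct eigenvalues -> diagonalizable
lams, P = np.linalg.eig(A)            # columns of P are eigenvectors
D = np.linalg.inv(P) @ A @ P          # similarity transform
assert np.allclose(D, np.diag(lams))  # D = P^{-1} A P is diagonal
```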
Every complex matrix \(\matA\) has at least one complex eigenvector, and every real symmetric matrix has real eigenvalues and at least one real eigenvector.
Every matrix \(\matA\in\bbC^{n\times n}\) is unitarily similar to an upper triangular matrix, i.e., \[ \bfA = \bfV\boldsymbol{\Delta}\bfV^\dagger \] with \(\boldsymbol{\Delta}\) upper triangular and \(\bfV^\dagger=\bfV^{-1}\).
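This is the (complex) Schur decomposition; SciPy exposes it directly, so the claim can be checked numerically (the matrix is arbitrary):

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(6)
A = rng.standard_normal((4, 4))
Delta, V = schur(A, output='complex')          # A = V Delta V^dagger
assert np.allclose(V @ Delta @ V.conj().T, A)
assert np.allclose(Delta, np.triu(Delta))      # Delta upper triangular
assert np.allclose(V.conj().T @ V, np.eye(4))  # V unitary
```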
Every Hermitian matrix is unitarily similar to a real-valued diagonal matrix.
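This is the spectral theorem for Hermitian matrices; `np.linalg.eigh` exploits exactly this structure (the example matrix is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(7)
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = B + B.conj().T                    # Hermitian by construction
lams, V = np.linalg.eigh(A)           # real eigenvalues, unitary eigenvector matrix
assert np.all(np.isreal(lams))
assert np.allclose(V @ np.diag(lams) @ V.conj().T, A)
```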
A symmetric matrix \(\matA\) is positive definite if it has positive eigenvalues, i.e., \(\forall i\in\set{1,\cdots,n}\quad\lambda_i>0\).
A symmetric matrix \(\matA\) is positive semidefinite if it has nonnegative eigenvalues, i.e., \(\forall i\in\set{1,\cdots,n}\quad\lambda_i\geq 0\).
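A small sketch classifying a symmetric matrix through the signs of its eigenvalues (the function name and tolerance are mine):

```python
import numpy as np

def definiteness(A, tol=1e-10):
    """Classify a symmetric matrix by the signs of its eigenvalues."""
    lams = np.linalg.eigvalsh(A)  # real eigenvalues of a symmetric matrix
    if np.all(lams > tol):
        return "positive definite"
    if np.all(lams >= -tol):
        return "positive semidefinite"
    return "indefinite"

X = np.random.default_rng(8).standard_normal((10, 3))
print(definiteness(X.T @ X))      # Gram matrices are always positive semidefinite
```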
For a symmetric matrix \(\matA\) with spectral decomposition \(\matA=\sum_{i=1}^n\lambda_i\vecv_i\vecv_i^\intercal\) and any analytic function \(f\), we have \[ f(\matA) = \sum_{i=1}^n f(\lambda_i)\vecv_i\vecv_i^\intercal \]
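A sketch of this spectral calculus for \(f=\exp\), checked against SciPy's `expm` (the symmetric matrix is arbitrary):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(9)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2                     # symmetric matrix
lams, V = np.linalg.eigh(A)           # orthonormal eigenvectors as columns

f_A = sum(np.exp(l) * np.outer(v, v) for l, v in zip(lams, V.T))
assert np.allclose(f_A, expm(A))      # spectral formula matches expm
```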
Let \(\set{\vecv_i}\) be the orthonormal eigenvectors of a symmetric invertible matrix \(\matA\) with eigenvalues \(\set{\lambda_i}\). The solution of \(\matA\vecx=\vecy\) is \[ \vecx = \sum_{i=1}^n\frac{1}{\lambda_i}\dotp{\vecy}{\vecv_i}\vecv_i \]
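A numerical check that the spectral formula solves \(\matA\vecx=\vecy\) (the positive definite matrix is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(10)
B = rng.standard_normal((4, 4))
A = B.T @ B + np.eye(4)               # symmetric positive definite
y = rng.standard_normal(4)

lams, V = np.linalg.eigh(A)           # orthonormal eigenvectors as columns
x = sum((1.0 / l) * np.dot(y, v) * v for l, v in zip(lams, V.T))
assert np.allclose(A @ x, y)          # spectral formula solves the system
```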
If the data is perturbed to \(\vecy+\vece\) and \(\tilde{\vecx}\) solves \(\matA\tilde{\vecx}=\vecy+\vece\), then, with the eigenvalues ordered as \(\lambda_1\geq\cdots\geq\lambda_n>0\), \[ \frac{1}{\lambda_1^2}\norm{\vece}^2\leq \norm{\vecx-\tilde{\vecx}}^2\leq \frac{1}{\lambda_n^2}\norm{\vece}^2. \]
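A numerical illustration of the bound under the assumptions above (matrix, data, and perturbation are arbitrary; `eigvalsh` returns eigenvalues in ascending order, so `lams[-1]` is \(\lambda_1\)):

```python
import numpy as np

rng = np.random.default_rng(11)
B = rng.standard_normal((4, 4))
A = B.T @ B + np.eye(4)                # symmetric positive definite
y = rng.standard_normal(4)
e = 1e-3 * rng.standard_normal(4)      # small perturbation of the data

x = np.linalg.solve(A, y)
x_tilde = np.linalg.solve(A, y + e)    # solution of the perturbed system

lams = np.linalg.eigvalsh(A)           # ascending: lams[0] = lambda_n, lams[-1] = lambda_1
err2 = np.sum((x - x_tilde) ** 2)
assert np.sum(e**2) / lams[-1]**2 <= err2 <= np.sum(e**2) / lams[0]**2
```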