Probabilities - Estimation
Wednesday, September 08, 2021
This is nice and looks simple… but we need to know \(p_{xy}\), the joint distribution of \(x\) and \(y\)
That could be hard to obtain in practice
For simplicity, we might want to restrict ourselves to linear estimates of the form \(\hat{x}\eqdef K_0 y\).
If \(R_y>0\) (i.e., \(R_y\) is invertible), the optimal gain is \(K_0=R_{xy}R_y^{-1}\)
We only need to know the second-order statistics of \(x\) and \(y\)
Question: how do we deal with non-zero mean?
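As a quick numerical sketch of the gain formula above: the model, dimensions, and all variable names below are hypothetical choices for illustration. The snippet estimates \(R_{xy}\) and \(R_y\) from samples and forms \(K_0 = R_{xy}R_y^{-1}\); it also handles the non-zero-mean case the standard way, by centering the data (equivalently, using an affine estimate \(\hat{x} = \bar{x} + K_0(y-\bar{y})\)).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy model for illustration: y = H x + noise,
# with x in R^2, y in R^3, and x given a non-zero mean on purpose.
n = 100_000
H = np.array([[1.0, 0.5], [0.0, 1.0], [2.0, -1.0]])
x = rng.normal(size=(n, 2)) + np.array([3.0, -1.0])
y = x @ H.T + 0.1 * rng.normal(size=(n, 3))

# Non-zero mean: center the variables first (standard fix), so the
# estimate is affine: x_hat = x_bar + K0 (y - y_bar).
xc = x - x.mean(axis=0)
yc = y - y.mean(axis=0)

# Second-order statistics on centered data:
# R_xy = E[x y^T], R_y = E[y y^T]
Rxy = xc.T @ yc / n
Ry = yc.T @ yc / n

K0 = Rxy @ np.linalg.inv(Ry)  # LLMS gain, requires R_y > 0
x_hat = x.mean(axis=0) + (y - y.mean(axis=0)) @ K0.T
```

Note that only covariances enter the computation: no densities, and no distributional assumptions beyond second-order statistics.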
The LLMS solution is such that \(K_0R_y = R_{xy}\), equivalently \[ \E{(x-K_0y)y^\intercal} = 0 \]
This can be viewed as an orthogonality condition (linear algebra!)
For centered (zero-mean) random variables, define the inner product \(\dotp{x}{y}\eqdef\E{xy^\intercal}\)
The LLMS estimate of \(x\) given \(y\) is characterized by the fact that the error \(\tilde{x}\eqdef x-\hat{x}\) is orthogonal to (i.e., uncorrelated with) the observation \(y\). Equivalently, the LLMS estimate is the projection of \(x\) onto the linear space spanned by \(y\).
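The orthogonality condition can be checked numerically. In the sketch below (toy model and names are hypothetical), the gain is fitted from sample covariances, so the sample cross-correlation between the error \(\tilde{x}=x-K_0y\) and \(y\) vanishes exactly, up to floating-point round-off:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical centered toy model: x in R^2 observed through y = H x + noise
n = 200_000
H = np.array([[1.0, 0.5], [0.0, 1.0], [2.0, -1.0]])
x = rng.normal(size=(n, 2))
y = x @ H.T + 0.1 * rng.normal(size=(n, 3))

# Sample second-order statistics and LLMS gain
Rxy = x.T @ y / n
Ry = y.T @ y / n
K0 = Rxy @ np.linalg.inv(Ry)

# Orthogonality: the error x_tilde = x - K0 y is uncorrelated with y.
# Since K0 solves K0 Ry = Rxy for the *sample* covariances, the sample
# cross-correlation below is zero except for round-off error.
err = x - y @ K0.T
cross = err.T @ y / n  # should be (numerically) the zero matrix
```

This is the projection picture: \(K_0 y\) is the component of \(x\) in the span of \(y\), and the residual is orthogonal to that span under the inner product \(\E{xy^\intercal}\).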