Key concepts: conditional probability and independence, random variables, distributions
Next workshop: Wednesday September 14, 2022
Expectations
The expectation of a function $g$ of a random variable $X$ is defined by its distribution:
$$\mathbb{E}[g(X)] = \int g(x)\, f_X(x)\, dx \qquad \text{(discrete case: } \mathbb{E}[g(X)] = \textstyle\sum_x g(x)\, p_X(x)\text{)}.$$
(Technicality: law of the unconscious statistician)
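A quick numerical illustration (not from the workshop materials): averaging $g$ over samples of $X$ agrees with integrating $g$ against the density of $X$, with the assumed choices $g(x) = x^2$ and $X \sim \mathrm{Exp}(1)$, for which $\mathbb{E}[X^2] = 2$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Law of the unconscious statistician, checked numerically:
# E[g(X)] comes from the distribution of X directly, without
# ever deriving the distribution of g(X).
g = lambda x: x**2

# Monte Carlo estimate: average g over samples of X ~ Exp(1).
samples = rng.exponential(scale=1.0, size=1_000_000)
mc_estimate = g(samples).mean()

# Riemann sum of g(x) * f_X(x) on a grid, with f_X(x) = exp(-x).
x = np.linspace(0.0, 50.0, 200_000)
dx = x[1] - x[0]
integral = np.sum(g(x) * np.exp(-x)) * dx

print(mc_estimate, integral)  # both approach E[X^2] = 2
```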
Properties:
$\mathbb{E}[aX + bY] = a\,\mathbb{E}[X] + b\,\mathbb{E}[Y]$ (linearity)
$\mathbb{E}[X] = \mathbb{E}\big[\mathbb{E}[X \mid Y]\big]$ (law of total expectation)
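The law of total expectation can be sanity-checked by simulation. The hierarchical model below is an assumption chosen purely for illustration: $Y \sim \mathrm{Unif}(0, 1)$ and $X \mid Y = y \sim \mathcal{N}(y, 1)$, so $\mathbb{E}\big[\mathbb{E}[X \mid Y]\big] = \mathbb{E}[Y] = 1/2$.

```python
import numpy as np

rng = np.random.default_rng(1)

# Check E[X] = E[E[X|Y]] on an assumed hierarchical model:
# Y ~ Unif(0, 1), and X | Y=y ~ Normal(mean=y, std=1).
y = rng.uniform(0.0, 1.0, size=1_000_000)
x = rng.normal(loc=y, scale=1.0)

# E[X | Y] = Y for this model, so both sides equal E[Y] = 0.5.
print(x.mean(), y.mean())  # both approach 0.5
```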
The mean of a random variable $X$ is defined as $\mu_X = \mathbb{E}[X]$.
The variance of a random variable $X$ is defined as $\operatorname{Var}(X) = \mathbb{E}\big[(X - \mathbb{E}[X])^2\big]$.
Drill: Prove that the variance may also be represented as $\operatorname{Var}(X) = \mathbb{E}[X^2] - (\mathbb{E}[X])^2$.
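Not a proof, but a numerical sanity check of the identity on an arbitrary (assumed) sample distribution:

```python
import numpy as np

rng = np.random.default_rng(2)

# Check Var(X) = E[X^2] - (E[X])^2 on samples from an
# arbitrary distribution (Gamma, chosen for illustration).
x = rng.gamma(shape=3.0, scale=2.0, size=1_000_000)

lhs = np.mean((x - x.mean()) ** 2)   # E[(X - E[X])^2]
rhs = np.mean(x**2) - x.mean() ** 2  # E[X^2] - (E[X])^2
print(lhs, rhs)  # agree up to sampling error
```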
Drill: Recall the uniform distribution has pdf defined on $[a, b]$ by $f(x) = \frac{1}{b - a}$. For a uniformly distributed random variable $X \sim \mathrm{Unif}(a, b)$, compute $\mathbb{E}[X]$ and $\operatorname{Var}(X)$.
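To check your answer to the drill, the snippet below compares sample moments against the closed forms $\mathbb{E}[X] = (a+b)/2$ and $\operatorname{Var}(X) = (b-a)^2/12$; the endpoints $a = 2$, $b = 5$ are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(3)

# Sample moments of Unif(a, b) vs. the closed-form answers.
a, b = 2.0, 5.0
x = rng.uniform(a, b, size=1_000_000)

print(x.mean(), (a + b) / 2)       # ~3.5
print(x.var(), (b - a) ** 2 / 12)  # ~0.75
```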
Estimation (MMSE)
Problem: We want to learn the realization of some random vector $X$ from a dependent random vector $Y$.
A common model for this situation is $Y = HX + W$, where $H$ is a constant, known matrix and $W$ is some random noise.
The mean squared error (MSE) of a deterministic estimator $\hat{x}(\cdot)$ of random variable $X$ from observation $Y$ is defined as
$$\mathrm{MSE}(\hat{x}) = \mathbb{E}\big[\lVert X - \hat{x}(Y) \rVert^2\big].$$
The minimum mean square estimator (MMSE) is then defined as
$$\hat{x}_{\mathrm{MMSE}} = \operatorname*{arg\,min}_{\hat{x}} \mathbb{E}\big[\lVert X - \hat{x}(Y) \rVert^2\big],$$
with solution
$$\hat{x}_{\mathrm{MMSE}}(y) = \mathbb{E}[X \mid Y = y].$$
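A minimal scalar sketch, under assumptions not taken from the workshop: with $X \sim \mathcal{N}(0,1)$, $Y = X + W$, $W \sim \mathcal{N}(0, \sigma_W^2)$, the conditional mean is $\mathbb{E}[X \mid Y = y] = y / (1 + \sigma_W^2)$, and it achieves a lower MSE than using the raw observation.

```python
import numpy as np

rng = np.random.default_rng(4)

# Scalar instance of Y = HX + W with H = 1 (illustrative only):
# X ~ N(0,1), W ~ N(0, 0.5^2), so E[X | Y=y] = y / (1 + 0.25).
sigma_w = 0.5
x = rng.normal(size=1_000_000)
y = x + sigma_w * rng.normal(size=x.size)

x_mmse = y / (1 + sigma_w**2)  # conditional mean for this model

# The conditional mean beats the raw observation in MSE.
print(np.mean((x - x_mmse) ** 2))  # ~ sigma_w^2/(1+sigma_w^2) = 0.2
print(np.mean((x - y) ** 2))       # ~ sigma_w^2 = 0.25
```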
Estimation (MMSE)
If one can compute $\mathbb{E}[X \mid Y = y]$ for arbitrary $y$, then we have our ideal (in the MMSE sense) estimator.
Example: If $(X, Y)$ is a zero-mean Gaussian random vector with (block) covariance matrix
$$\Sigma = \begin{bmatrix} \Sigma_{XX} & \Sigma_{XY} \\ \Sigma_{YX} & \Sigma_{YY} \end{bmatrix},$$
then the MMSE estimator is
$$\hat{x}_{\mathrm{MMSE}}(y) = \Sigma_{XY}\, \Sigma_{YY}^{-1}\, y.$$
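A sketch of the block-covariance formula in code, with assumed dimensions and an assumed observation model $Y = HX + W$ (so the blocks $\Sigma_{XY}$ and $\Sigma_{YY}$ have closed forms); everything here is illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)

# Assumed model: Y = H X + W, X ~ N(0, S_xx), W ~ N(0, S_ww).
H = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
S_xx = np.eye(2)
S_ww = 0.1 * np.eye(3)

S_xy = S_xx @ H.T             # Cov(X, Y) for Y = H X + W
S_yy = H @ S_xx @ H.T + S_ww  # Cov(Y)

# MMSE estimate from one observation y
# (solve against S_yy rather than forming its inverse).
x_true = rng.multivariate_normal(np.zeros(2), S_xx)
y = H @ x_true + rng.multivariate_normal(np.zeros(3), S_ww)
x_hat = S_xy @ np.linalg.solve(S_yy, y)
print(x_true, x_hat)
```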
When we are not so lucky, what can we do…
Estimation (LMMSE)
A linear estimator is one of the form $\hat{x}(y) = Ay + b$. The parameters of the linear minimum mean square estimator (LMMSE) are then defined as
$$(A^*, b^*) = \operatorname*{arg\,min}_{A,\, b} \mathbb{E}\big[\lVert X - AY - b \rVert^2\big],$$
with solution satisfying
$$A^* = \Sigma_{XY}\, \Sigma_{YY}^{-1}, \qquad b^* = \mu_X - A^* \mu_Y,$$
where $\Sigma_{XY} = \mathbb{E}\big[(X - \mu_X)(Y - \mu_Y)^T\big]$ and $\Sigma_{YY} = \mathbb{E}\big[(Y - \mu_Y)(Y - \mu_Y)^T\big]$.
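The LMMSE solution needs only first and second moments, so it can be fit from sample means and covariances even when the joint distribution is not Gaussian. The data-generating model below is an arbitrary assumption for illustration.

```python
import numpy as np

rng = np.random.default_rng(6)

# Fit A* = S_xy inv(S_yy), b* = mu_x - A* mu_y from samples of an
# assumed non-Gaussian pair (X, Y).
x = rng.exponential(size=(100_000, 1))
y = np.hstack([x + 0.3 * rng.normal(size=x.shape),
               x**2 + 0.3 * rng.normal(size=x.shape)])

mu_x, mu_y = x.mean(axis=0), y.mean(axis=0)
S_xy = (x - mu_x).T @ (y - mu_y) / x.shape[0]
S_yy = (y - mu_y).T @ (y - mu_y) / y.shape[0]

A = np.linalg.solve(S_yy.T, S_xy.T).T  # A = S_xy @ inv(S_yy)
b = mu_x - A @ mu_y

x_hat = y @ A.T + b
print(np.mean((x - x_hat) ** 2))  # best *linear* MSE, not the full MMSE
```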
Drill: Further restrict the linear model above to be of the form $\hat{x}(y) = Ay$ (equivalently, set $b = 0$). Prove that the matrix $A$ which solves the LMMSE problem in this case indeed satisfies
$$A\, \mathbb{E}[YY^T] = \mathbb{E}[XY^T].$$
Estimation (LMMSE)
The LMMSE problem could have been a linear algebra problem…
Random variables form their own vector space.
The natural inner product on this vector space is $\langle X, Y \rangle = \mathbb{E}[XY]$.
The LMMSE problem becomes a least squares problem, which can be solved with the orthogonality principle.
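To make the orthogonality principle concrete, the sketch below (with an assumed zero-mean linear data model) fits the $b = 0$ LMMSE matrix from the normal equations $A\,\mathbb{E}[YY^T] = \mathbb{E}[XY^T]$ and checks that the resulting error is empirically uncorrelated with $Y$.

```python
import numpy as np

rng = np.random.default_rng(7)

# Assumed zero-mean model: scalar X is a noisy linear function of Y.
y = rng.normal(size=(200_000, 3))
x = y @ np.array([0.5, -1.0, 2.0]) + rng.normal(size=200_000)

R_xy = x @ y / x.size        # E[X Y^T]
R_yy = y.T @ y / y.shape[0]  # E[Y Y^T]
A = np.linalg.solve(R_yy, R_xy)  # normal equations: R_yy A = R_xy

# Orthogonality principle: the error X - AY is uncorrelated
# with every component of Y.
err = x - y @ A
print(err @ y / err.size)  # ~ [0, 0, 0]
```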