The Mathematics of ECE

Probabilities - characteristic functions, MGFs, and random processes

Wednesday, September 15, 2021

Probabilities: roadmap

  • Last time: estimation - given \(y\), what is \(x\)?

    • Key concepts: conditional expectation, minimum mean square estimation
    • Key result: orthogonality principle
    • Useful in signal processing, robotics, machine learning
  • Today:

    • Characteristic function
    • Random processes

Characteristic Function

  • \(\phi_X(t) = \mathbb{E}\left[e^{itX}\right]\)
  • The characteristic function is the Fourier transform of the pdf with the sign of the frequency variable flipped: \(\phi_X(t) = \int p_X(x) e^{itx} \, dx\).
  • The sum of two independent random variables has a pdf equal to the convolution of the two original pdfs.
    • \(Z = X + Y\)
    • \(p_Z(z) = \int_{-\infty}^{\infty} p_X(x) \, p_Y(z - x) \, dx\)
  • Equivalently, since convolution of pdfs corresponds to multiplication of their transforms, the characteristic function of the sum is the product of the characteristic functions (a numerical check follows this list).
    • \(\phi_Z(t) = \phi_X(t) \, \phi_Y(t)\)
  • For \(X \sim \mathcal{N}(0, \sigma^2)\) the characteristic function is \(\phi_X(t) = e^{-\frac{1}{2} \sigma^2 t^2}\)
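
To make the sum-to-product identity concrete, here is a minimal Monte Carlo sketch (assuming NumPy is available; the variances and sample size are arbitrary illustrative choices). It estimates the empirical characteristic function of \(Z = X + Y\) for independent zero-mean Gaussians and checks it against both \(\phi_X(t)\phi_Y(t)\) and the closed form \(e^{-\frac{1}{2}(\sigma_X^2 + \sigma_Y^2) t^2}\).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
sigma_x, sigma_y = 1.0, 2.0  # illustrative standard deviations

# Independent zero-mean Gaussian samples.
x = rng.normal(0.0, sigma_x, n)
y = rng.normal(0.0, sigma_y, n)
z = x + y

t = np.linspace(-2.0, 2.0, 9)

def empirical_cf(samples, t):
    # phi(t) = E[exp(i t X)], estimated by the sample mean over the draws.
    return np.exp(1j * np.outer(t, samples)).mean(axis=1)

phi_z_hat = empirical_cf(z, t)                         # CF of the sum, estimated directly
phi_product = empirical_cf(x, t) * empirical_cf(y, t)  # product of the individual CFs
phi_closed_form = np.exp(-0.5 * (sigma_x**2 + sigma_y**2) * t**2)  # Z ~ N(0, sigma_x^2 + sigma_y^2)

# All three agree up to Monte Carlo error (on the order of 1/sqrt(n)).
print(np.max(np.abs(phi_z_hat - phi_product)))
print(np.max(np.abs(phi_z_hat - phi_closed_form)))
```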

Moment Generating Function

  • \(M_X(t) = \mathbb{E}\left[e^{tX}\right]\)
  • The \(p\)th moment can be calculated from the \(p\)th derivative of the MGF at \(t=0\), assuming the MGF exists in a neighborhood of \(t=0\) (a symbolic check follows this list)
    • \(\mathbb{E}\left[X^p\right] = \left.\frac{d^p M_X}{d t^p}\right|_{t=0}\)
  • For \(X \sim \mathcal{N}(0, \sigma^2)\) the MGF is \(M_X(t) = e^{\frac{1}{2} \sigma^2 t^2}\)
  • For \(X \sim \mathcal{N}(0, \sigma^2)\), every odd moment (\(p=1,3,5,\dots\)) is 0.
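
As a quick check of the derivative rule and the odd-moment claim, here is a short symbolic sketch (assuming SymPy is available; purely illustrative). Differentiating the \(\mathcal{N}(0, \sigma^2)\) MGF \(p\) times and evaluating at \(t=0\) gives 0 for odd \(p\) and \(\sigma^2, 3\sigma^4, 15\sigma^6\) for \(p = 2, 4, 6\).

```python
import sympy as sp

t, sigma = sp.symbols("t sigma", positive=True)
M = sp.exp(sp.Rational(1, 2) * sigma**2 * t**2)  # MGF of N(0, sigma^2)

# E[X^p] = p-th derivative of M_X evaluated at t = 0.
for p in range(1, 7):
    moment = sp.diff(M, t, p).subs(t, 0)
    print(p, sp.simplify(moment))
```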

Random processes

  • A set of random variables \(\{X_t\}\) indexed by time \(t\).
  • The autocovariance between two points in a random process is \(K_X(s, t) = \mathbb{E}\left[(X_s - \mathbb{E}\left[X_s\right])(X_t - \mathbb{E}\left[X_t\right])\right] = \mathbb{E}\left[X_s X_t\right] - \mathbb{E}\left[X_s\right]\mathbb{E}\left[X_t\right]\)
  • A random process is wide-sense stationary (WSS) if it has the following properties (a simulation sketch follows this list):
    • \(\forall t ~:~ \mathbb{E}\left[X_t\right] = \mu\)
    • \(\forall s,t ~:~ K_X(s, t) = K_X(s - t, 0)\)
    • \(\forall t ~:~ \mathbb{E}\left[|X_t|^2\right] < \infty\)
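
To make the stationarity conditions concrete, here is a small simulation sketch (assuming NumPy; the AR(1) model and its parameters are my own illustrative choices, not from the lecture). It estimates the autocovariance of a zero-mean AR(1) process at a few lags and compares against the known closed form \(K_X(\tau) = \frac{\sigma_w^2}{1 - a^2} a^{|\tau|}\); the estimate depends only on the lag \(s - t\), as the second condition requires.

```python
import numpy as np

rng = np.random.default_rng(1)
n, a, sigma_w = 100_000, 0.8, 1.0

# AR(1) process X_t = a X_{t-1} + W_t driven by Gaussian white noise W_t;
# once the transient from X_0 = 0 dies out it is (approximately) wide-sense stationary.
w = rng.normal(0.0, sigma_w, n)
x = np.zeros(n)
for k in range(1, n):
    x[k] = a * x[k - 1] + w[k]
x = x[1000:]  # discard the transient

def autocov(samples, lag):
    # Sample estimate of K_X(t + lag, t); for a WSS process it depends only on lag.
    c = samples - samples.mean()
    return np.mean(c[: len(c) - lag] * c[lag:])

for lag in range(5):
    theory = sigma_w**2 / (1 - a**2) * a**lag  # known AR(1) autocovariance
    print(lag, autocov(x, lag), theory)
```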