Probability & Statistics - Lecture 3
Wednesday, September 14, 2022
Markov's Inequality: Let \(X\) be a real, non-negative random variable. Then for all \(t > 0\) \[ \P{X \geq t} \leq \frac{\E{X}}{t} \]
Drill: Prove Markov’s Inequality.
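Before attempting the proof, it can help to see the bound in action. Below is a minimal numerical sketch (assuming NumPy) using Exponential(1) samples: the exact tail probability at \(t = 3\) is \(e^{-3} \approx 0.05\), so the Markov bound of \(1/3\) holds but is loose.

```python
import numpy as np

rng = np.random.default_rng(0)

# Exponential(1) samples: non-negative, with E[X] = 1.
x = rng.exponential(scale=1.0, size=1_000_000)

t = 3.0
empirical = np.mean(x >= t)   # Monte Carlo estimate of P(X >= t)
bound = x.mean() / t          # E[X] / t, estimated from the same samples

print(f"P(X >= {t}) ~ {empirical:.4f} <= Markov bound {bound:.4f}")
```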
Chebyshev's Inequality: Let \(X\) be a real random variable with finite variance. Then for all \(t > 0\) \[ \P{|X - \E{X}| \geq t} \leq \frac{\Var{X}}{t^2} \]
Drill: Prove Chebyshev’s Inequality.
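The same sanity check works here (again assuming NumPy), this time with standard normal samples; the exact two-sided tail at \(t = 2\) is about \(0.046\), against a Chebyshev bound of \(0.25\).

```python
import numpy as np

rng = np.random.default_rng(0)

# Standard normal samples: E[X] = 0, Var[X] = 1.
x = rng.standard_normal(1_000_000)

t = 2.0
empirical = np.mean(np.abs(x - x.mean()) >= t)  # P(|X - E[X]| >= t)
bound = x.var() / t**2                          # Var[X] / t^2

print(f"P(|X - E[X]| >= {t}) ~ {empirical:.4f} <= bound {bound:.4f}")
```

As the numbers show, these bounds hold for every qualifying distribution, but they need not be tight for any particular one.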
Weak Law of Large Numbers: Let \(\{X_i\}_{i=1}^N\) be independent and identically distributed (i.i.d.) with finite mean \(\mu = \E{X_1}\). Then for all \(\epsilon > 0\) \[ \lim_{N \to \infty} \P{\abs{\frac{1}{N} \sum_{i=1}^N X_i - \mu} \geq \epsilon} = 0 \]
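This convergence is easy to watch in simulation. A minimal sketch (assuming NumPy) with Bernoulli(0.5) samples, estimating the deviation probability over 10,000 independent trials for each \(N\):

```python
import numpy as np

rng = np.random.default_rng(0)
mu, eps = 0.5, 0.05  # Bernoulli(0.5) mean and tolerance

# For each N, draw 10,000 independent sample means of N Bernoulli trials
# and estimate P(|sample mean - mu| >= eps).
for N in (10, 100, 1_000, 10_000):
    means = rng.binomial(n=N, p=mu, size=10_000) / N
    prob = np.mean(np.abs(means - mu) >= eps)
    print(f"N = {N:>6}: P(|mean - mu| >= {eps}) ~ {prob:.4f}")
```

The printed probabilities shrink toward zero as \(N\) grows, as the theorem predicts.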
When two independent random variables \(X, Y\) are added to make a new random variable \(Z = X + Y\), the pdf of \(Z\) is the convolution of those of \(X, Y\), i.e. \[ f_Z(z) = \int_x f_X(x) f_Y(z - x) dx \]
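As a concrete check (assuming NumPy): for two independent Uniform(0, 1) variables, the convolution of the densities is the triangular density on \([0, 2]\), and a discretized version of the integral above agrees with a histogram of simulated sums.

```python
import numpy as np

rng = np.random.default_rng(0)

# Z = X + Y with X, Y independent Uniform(0, 1).
z = rng.uniform(size=1_000_000) + rng.uniform(size=1_000_000)

# Discretize the convolution integral on a grid with spacing dx.
dx = 0.001
grid = np.arange(0.0, 1.0, dx)
f = np.ones_like(grid)        # Uniform(0, 1) density sampled on the grid
f_Z = np.convolve(f, f) * dx  # approximates f_Z on [0, 2) with step dx

# The triangular density gives f_Z(0.5) = 0.5; compare both estimates.
hist, _ = np.histogram(z, bins=200, range=(0, 2), density=True)
print("convolution:", f_Z[round(0.5 / dx)], " histogram:", hist[49])
```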
When a sum has many terms, computing convolution after convolution becomes inconvenient. Where have we seen this before…
The characteristic function \(\varphi_X\) of a random variable \(X\) is defined as \[ \varphi_X(t) = \E{e^{j t X}} \]
Because \(\E{e^{j t X}} = \int_x f_X(x) e^{j t x} dx\), the characteristic function is, up to the sign convention in the exponent, the Fourier transform of the pdf \(f_X\).
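A minimal sketch (assuming NumPy) that compares the empirical characteristic function \(\frac{1}{N} \sum_i e^{j t x_i}\) of standard normal samples against the known closed form \(\varphi(t) = e^{-t^2/2}\):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)

# Empirical characteristic function: average e^{jtX} over the samples.
for t in (0.5, 1.0, 2.0):
    phi_hat = np.mean(np.exp(1j * t * x))
    phi = np.exp(-t**2 / 2)  # closed form for a standard normal
    print(f"t = {t}: empirical {phi_hat.real:.4f}, exact {phi:.4f}")
```

(The imaginary part of the empirical estimate is near zero here because the standard normal is symmetric about zero.)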
Drill: Show that for independent \(X, Y\) and \(Z = X + Y\) the characteristic function \(\varphi_Z(t) = \varphi_X(t) \varphi_Y(t)\) (convolution in one domain is multiplication in the other).
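This is not a substitute for the proof, but the identity is easy to verify numerically (assuming NumPy); here \(X\) is Exponential(1) and \(Y\) is an independent standard normal:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(size=1_000_000)   # X ~ Exponential(1)
y = rng.standard_normal(1_000_000)    # Y ~ N(0, 1), independent of X

def phi(s, t):
    """Empirical characteristic function of the samples s at frequency t."""
    return np.mean(np.exp(1j * t * s))

t = 1.5
print("phi_Z(t)          :", phi(x + y, t))
print("phi_X(t) phi_Y(t) :", phi(x, t) * phi(y, t))
```

The two printed complex numbers agree up to Monte Carlo error.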