Prof. Matthieu Bloch
Thursday, September 7, 2023
Let \(X, Y\) be two random variables with joint PMF \(P_{XY}\). The mutual information between \(X\) and \(Y\) is \[ \bbI(X;Y)\eqdef \bbH(X)-\bbH(X|Y)=\bbD(P_{XY}\Vert P_XP_Y) = \bbH(Y)-\bbH(Y|X). \]
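As a quick sanity check, here is a minimal numeric sketch that evaluates all three expressions and confirms they agree; the joint PMF and helper names below are made up for illustration.

```python
# Minimal numeric check that the three expressions for I(X;Y) coincide.
# The joint PMF below is a made-up example; any valid PMF works.
import numpy as np

P_XY = np.array([[0.30, 0.10],
                 [0.05, 0.55]])              # rows indexed by x, columns by y

P_X = P_XY.sum(axis=1)                       # marginal of X
P_Y = P_XY.sum(axis=0)                       # marginal of Y

def H(p):
    """Shannon entropy in bits, ignoring zero-probability outcomes."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_XY = H(P_XY.ravel())
H_X_given_Y = H_XY - H(P_Y)                  # chain rule: H(X|Y) = H(XY) - H(Y)
H_Y_given_X = H_XY - H(P_X)

mask = P_XY > 0
Q = np.outer(P_X, P_Y)                       # product of the marginals
D = np.sum(P_XY[mask] * np.log2(P_XY[mask] / Q[mask]))

# All three print the same value.
print(H(P_X) - H_X_given_Y, D, H(P_Y) - H_Y_given_X)
```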
If \(\Delta(\calX\to\calY)\) denotes the set of kernels from \(\calX\) to \(\calY\), then the function \[ I:\Delta(\calX)\times \Delta(\calX\to\calY)\to\mathbb{R}:(P,W)\mapsto I(P;W)\eqdef \mathbb{D}(W\cdot P\Vert (W\circ P)P) \] is a concave function of \(P\) and a convex function of \(W\).
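These concavity and convexity properties can be spot-checked numerically. The sketch below draws random distributions and kernels (all names are ours) and verifies both inequalities at a single mixing weight, which illustrates but of course does not prove the statement.

```python
# Spot-check: I(P;W) is concave in P and convex in W (one random instance).
import numpy as np

rng = np.random.default_rng(0)

def mutual_information(P, W):
    """I(P;W) = D(W.P || (WoP) x P) in bits; P: (n,), W: (n, m) row-stochastic."""
    joint = P[:, None] * W                   # the joint distribution W.P
    out = joint.sum(axis=0)                  # the output distribution WoP
    ref = np.outer(P, out)
    mask = joint > 0
    return np.sum(joint[mask] * np.log2(joint[mask] / ref[mask]))

def rand_pmf(n):
    p = rng.random(n)
    return p / p.sum()

def rand_kernel(n, m):
    W = rng.random((n, m))
    return W / W.sum(axis=1, keepdims=True)

n, m, lam = 4, 3, 0.3
P1, P2, W1, W2 = rand_pmf(n), rand_pmf(n), rand_kernel(n, m), rand_kernel(n, m)

# Concave in P: I(lam P1 + (1-lam) P2; W1) >= lam I(P1;W1) + (1-lam) I(P2;W1)
assert mutual_information(lam * P1 + (1 - lam) * P2, W1) >= \
       lam * mutual_information(P1, W1) + (1 - lam) * mutual_information(P2, W1) - 1e-12

# Convex in W: I(P1; lam W1 + (1-lam) W2) <= lam I(P1;W1) + (1-lam) I(P1;W2)
assert mutual_information(P1, lam * W1 + (1 - lam) * W2) <= \
       lam * mutual_information(P1, W1) + (1 - lam) * mutual_information(P1, W2) + 1e-12
print("concavity in P and convexity in W hold for this instance")
```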
Let \(X, Y, Z\) be discrete random variables with joint distribution \(P_{XYZ}\). The conditional mutual information between \(X\) and \(Y\) given \(Z\) is \(\mathbb{I}(X ; Y|Z) \eqdef \mathbb{H}(X|Z) - \mathbb{H}(X|YZ)\).
\(\mathbb{I}(X;Y|Z)\geq 0\), with equality if and only if \(X\) and \(Y\) are conditionally independent given \(Z\).
Let \(\bfX\eqdef\set{X_i}_{i=1}^n\), \(Y\), and \(Z\) be jointly distributed random variables. Then,
\[\begin{align*} \mathbb{I}(\bfX;Y|Z) = \sum_{i=1}^n\mathbb{I}(X_i;Y|Z\bfX^{1:i-1}), \end{align*}\]
with the convention that \(\bfX^{1:0}=\emptyset\).
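The chain rule (and, along the way, the nonnegativity of each term) can be verified numerically. The sketch below takes \(n=2\) on a random four-variable PMF; the helper `I_cond` and the axis conventions are ours.

```python
# Check I(X1 X2; Y | Z) = I(X1; Y | Z) + I(X2; Y | Z X1) on a random PMF.
import numpy as np

rng = np.random.default_rng(1)
P = rng.random((2, 2, 2, 2))
P /= P.sum()                                 # axes: X1, X2, Y, Z

def H(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def I_cond(P, A, B, C):
    """I(A;B|C) = H(AC) + H(BC) - H(ABC) - H(C); A, B, C are axis tuples."""
    def marg(axes):
        drop = tuple(i for i in range(P.ndim) if i not in axes)
        return H(P.sum(axis=drop).ravel())
    return marg(A + C) + marg(B + C) - marg(A + B + C) - marg(C)

lhs = I_cond(P, (0, 1), (2,), (3,))          # I(X1 X2; Y | Z)
rhs = I_cond(P, (0,), (2,), (3,)) + I_cond(P, (1,), (2,), (3, 0))
print(lhs, rhs)                              # equal up to floating-point error
assert min(lhs, rhs) >= 0                    # each term is also nonnegative
```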
Let \(X\), \(Y\), and \(Z\) be discrete random variables such that \(X - Y - Z\) forms a Markov chain. Then \(\mathbb{I}(X;Y) \geq \mathbb{I}(X;Z)\) or, equivalently, \(\mathbb{H}(X|Z) \geq \mathbb{H}(X|Y)\).
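A numeric illustration of this data-processing inequality, using a Markov chain assembled from random kernels (the construction and names are ours):

```python
# Data processing: build a random Markov chain X - Y - Z and check
# I(X;Y) >= I(X;Z). One random instance; an illustration, not a proof.
import numpy as np

rng = np.random.default_rng(2)

def H(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def I(P):                    # mutual information of a 2-D joint PMF
    return H(P.sum(axis=1)) + H(P.sum(axis=0)) - H(P.ravel())

nx, ny, nz = 3, 3, 3
P_X = rng.random(nx); P_X /= P_X.sum()
W_YX = rng.random((nx, ny)); W_YX /= W_YX.sum(axis=1, keepdims=True)   # P(y|x)
W_ZY = rng.random((ny, nz)); W_ZY /= W_ZY.sum(axis=1, keepdims=True)   # P(z|y)

P_XY = P_X[:, None] * W_YX
P_XZ = P_XY @ W_ZY           # valid because Z depends on (X,Y) only through Y

assert I(P_XY) >= I(P_XZ) - 1e-12
print(I(P_XY), I(P_XZ))
```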
Let \(X\) be a discrete random variable with alphabet \(\calX\), and let \(\widehat{X}\in\calX\) be an estimate of \(X\), with joint distribution \(P_{X\widehat{X}}\). Define the probability of estimation error \(P_e \eqdef \mathbb{P}[X\neq\widehat{X}]\). Then, \(\mathbb{H}(X|\widehat{X}) \leq \mathbb{H}_b(P_e) + P_e \log(\card{\calX}-1)\).
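Fano's inequality is easy to test numerically; the random joint PMF of \((X,\widehat{X})\) below is hypothetical.

```python
# Check Fano's inequality H(X|Xhat) <= H_b(Pe) + Pe log(|X| - 1).
import numpy as np

rng = np.random.default_rng(3)
n = 4                                        # |X|
P = rng.random((n, n)); P /= P.sum()         # joint PMF of (X, Xhat)

def H(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_X_given_Xhat = H(P.ravel()) - H(P.sum(axis=0))   # H(X Xhat) - H(Xhat)
Pe = 1.0 - np.trace(P)                             # P[X != Xhat]
Hb = H(np.array([Pe, 1 - Pe]))                     # binary entropy of Pe

assert H_X_given_Xhat <= Hb + Pe * np.log2(n - 1) + 1e-12
print(H_X_given_Xhat, Hb + Pe * np.log2(n - 1))
```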
Let \(X\in\calX\) be a discrete random variable. The Rényi entropy of order \(\alpha\in\mathbb{R}\cup\{\infty\}\) of \(X\) is
\[\begin{align*} H_{\alpha}(X)\eqdef \frac{1}{1-\alpha}\log \sum_{x\in\calX} p_X(x)^\alpha, \end{align*}\]
with the convention that \(H_1(X)=\lim_{\alpha\rightarrow 1} H_\alpha(X)\) and \(H_\infty(X)=\lim_{\alpha\rightarrow \infty} H_\alpha(X)\).
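A short sketch evaluating \(H_\alpha\) for a made-up PMF at several orders, including the two limiting conventions (\(\alpha=1\) recovers the Shannon entropy, \(\alpha=\infty\) the min-entropy):

```python
# Rényi entropies of a hypothetical PMF; H_alpha is non-increasing in alpha.
import numpy as np

p = np.array([0.5, 0.25, 0.125, 0.125])      # assumed full support

def renyi(p, alpha):
    if alpha == 1:                           # limit alpha -> 1: Shannon entropy
        return -np.sum(p * np.log2(p))
    if alpha == np.inf:                      # limit alpha -> inf: min-entropy
        return -np.log2(p.max())
    return np.log2(np.sum(p ** alpha)) / (1 - alpha)

for a in [0, 0.5, 1, 2, np.inf]:
    print(a, renyi(p, a))
# alpha = 0 gives log2 of the support size (2 bits for the 4 outcomes here).
```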
Let \(X\in\calX\) be a discrete random variable. Then,
Let \(X\in\calX\) and \(Y\in\calY\) be discrete random variables. Then,
Let \(X,Y,Z\) be three discrete random variables. Then,
Let \(X,Y,Z\) be three discrete random variables. Then, \(\forall \alpha\in\mathbb{R}^+\cup\{\infty\}\) \(H^A_{\alpha}(X|YZ)\leq H^A_{\alpha}(X|Y)\). If \(\alpha\notin\{0,\infty\}\), equality holds if and only if \(X-Y-Z\) forms a Markov chain.
Let \(X,Y,Z\) be three discrete random variables such that \(X-Y-Z\) forms a Markov chain. Then, \(\forall\alpha\in\mathbb{R}^+\cup\{\infty\}\), \(H^A_\alpha(X|Y)\leq H^A_\alpha(X|Z)\).
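A numeric spot-check of the last two statements, assuming \(H^A\) denotes Arimoto's conditional Rényi entropy \(H^A_\alpha(X|Y)=\frac{\alpha}{1-\alpha}\log\sum_y\big(\sum_x P_{XY}(x,y)^\alpha\big)^{1/\alpha}\) (our reading of the notation). Since the sampled chain is exactly Markov, the first comparison should hold with equality, consistent with the equality condition stated above.

```python
# Spot-check: conditioning reduces Arimoto entropy, and data processing
# holds along a Markov chain X - Y - Z built from random kernels.
import numpy as np

rng = np.random.default_rng(5)

def arimoto(P_UV, alpha):
    """H^A_alpha(U|V) for a 2-D joint PMF, alpha in (0,1) or (1,inf)."""
    inner = (P_UV ** alpha).sum(axis=0) ** (1.0 / alpha)
    return (alpha / (1 - alpha)) * np.log2(inner.sum())

# P(x,y,z) = P(x) P(y|x) P(z|y), so X - Y - Z is exactly Markov.
nx, ny, nz = 3, 3, 3
P_X = rng.random(nx); P_X /= P_X.sum()
W_YX = rng.random((nx, ny)); W_YX /= W_YX.sum(axis=1, keepdims=True)
W_ZY = rng.random((ny, nz)); W_ZY /= W_ZY.sum(axis=1, keepdims=True)
P = P_X[:, None, None] * W_YX[:, :, None] * W_ZY[None, :, :]

for a in [0.5, 2.0, 5.0]:
    h_x_yz = arimoto(P.reshape(nx, ny * nz), a)   # H^A_a(X|YZ)
    h_x_y = arimoto(P.sum(axis=2), a)             # H^A_a(X|Y)
    h_x_z = arimoto(P.sum(axis=1), a)             # H^A_a(X|Z)
    assert abs(h_x_yz - h_x_y) <= 1e-9            # equality: exact Markov chain
    assert h_x_y <= h_x_z + 1e-12                 # data processing
print("conditioning and data-processing checks passed")
```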
Let \(X,Y,Z\) be three discrete random variables. Then, \(\forall \alpha\in\mathbb{R}^+\cup\{\infty\}\)
\[\begin{align*} H^A_{\alpha}(XY|Z)\geq H^A_\alpha(X|Z) \geq H^A_\alpha(XY|Z)-H^A_0(Y|Z)\geq H^A_\alpha(XY|Z)-\log\card{\calY}. \end{align*}\]
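The same reading of \(H^A\) lets us spot-check this chain of bounds on a random PMF; the \(\alpha\to 0\) limit \(H^A_0(Y|Z)=\log\max_z\card{\mathrm{supp}\,P_{Y|Z=z}}\) used below is our derivation from the definition.

```python
# Spot-check of the three bounds on a random full-support PMF over (X, Y, Z).
import numpy as np

rng = np.random.default_rng(4)
P = rng.random((3, 3, 3)); P /= P.sum()       # axes: X, Y, Z

def arimoto(P_UV, alpha):
    """H^A_alpha(U|V) for a 2-D joint PMF, alpha in (0,1) or (1,inf)."""
    inner = (P_UV ** alpha).sum(axis=0) ** (1.0 / alpha)
    return (alpha / (1 - alpha)) * np.log2(inner.sum())

def arimoto0(P_UV):
    """alpha -> 0 limit: log of the largest conditional support size."""
    return np.log2((P_UV > 0).sum(axis=0).max())

P_XY_Z = P.reshape(9, 3)       # treat (X, Y) as a single variable given Z
P_X_Z = P.sum(axis=1)          # marginalize out Y
h0_Y_Z = arimoto0(P.sum(axis=0))
log_card_Y = np.log2(P.shape[1])

for a in [0.5, 2.0, 5.0]:
    h_XY_Z, h_X_Z = arimoto(P_XY_Z, a), arimoto(P_X_Z, a)
    assert h_XY_Z >= h_X_Z - 1e-12                          # first bound
    assert h_X_Z >= h_XY_Z - h0_Y_Z - 1e-12                 # second bound
    assert h_XY_Z - h0_Y_Z >= h_XY_Z - log_card_Y - 1e-12   # third bound
print("the chain of bounds holds for this PMF")
```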
Let \(X\in\calX\) be a discrete random variable and let \(\epsilon\geq 0\). The \(\epsilon\)-smooth Rényi entropy of order \(\alpha\in\mathbb{R}\cup\{\infty\}\) of \(X\) is
\[\begin{align*} H^\epsilon_{\alpha}(X)\eqdef \frac{1}{1-\alpha}\inf_{Q\in\calB^\epsilon(P_X)}\log \sum_{x\in\calX} Q(x)^\alpha, \end{align*}\]
with the convention that \(H^\epsilon_1(X)=\lim_{\alpha\rightarrow 1} H^\epsilon_\alpha(X)\) and \(H^\epsilon_\infty(X)=\lim_{\alpha\rightarrow \infty} H^\epsilon_\alpha(X)\).
This definition may look complicated; we will come back to this quantity later, as it is actually quite natural in some cases.
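For \(\alpha=\infty\), the smoothing admits a simple picture under one common convention: assuming \(\calB^\epsilon(P_X)\) contains the sub-normalized distributions within \(\ell_1\)-distance \(\epsilon\) of \(P_X\) (an assumption; the ball used here may differ), the optimal \(Q\) caps the largest probabilities, trimming \(\epsilon\) of mass in total. A hedged sketch:

```python
# Sketch of H^eps_inf under the assumed l1 ball: cap p at a threshold t
# chosen so the trimmed mass sum_x max(p(x) - t, 0) equals eps; then
# H^eps_inf = -log2(t). The cap is found by bisection.
import numpy as np

def smooth_min_entropy(p, eps, iters=60):
    lo, hi = 0.0, p.max()
    for _ in range(iters):
        t = (lo + hi) / 2
        if np.maximum(p - t, 0).sum() > eps:
            lo = t                            # trims too much: raise the cap
        else:
            hi = t                            # feasible: try a lower cap
    return -np.log2(hi)

p = np.array([0.4, 0.3, 0.2, 0.1])
print(smooth_min_entropy(p, 0.0))             # -log2(0.4): plain min-entropy
print(smooth_min_entropy(p, 0.1))             # smoothing strictly increases it
```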