Entropy power inequality

The entropy power inequality is a result in information theory that relates the differential entropy of a sum of independent random variables to the differential entropies of the summands. It shows that the entropy power is superadditive under addition of independent random variables.

The inequality
For a random variable $X$ taking values in $\mathbb{R}^n$ with density $f$, the entropy power of $X$ is \begin{align} \sigma_n(X) = \frac{1}{2 \pi e} e^{\frac{2}{n} h(X)}, \end{align} where $h(X)$ is the differential entropy of $X$. The inequality states that for independent random variables $X$ and $Y$, \begin{align} \sigma_n(X + Y) \geq \sigma_n(X) + \sigma_n(Y), \end{align} with equality if and only if $X \sim \mathcal{N}(\mu_X,\Sigma_X)$ and $Y \sim \mathcal{N}(\mu_Y,\Sigma_Y)$ are multivariate Gaussian with proportional covariance matrices, i.e. $\Sigma_X = \alpha \Sigma_Y$ for some constant $\alpha > 0$.
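
To make the statement concrete, the following worked example (not part of the original text) treats the one-dimensional Gaussian case, where the entropy power reduces to the variance and the inequality recovers the additivity of variances:
\begin{align}
X \sim \mathcal{N}(\mu, \sigma^2) \ \text{on } \mathbb{R} &\implies h(X) = \tfrac{1}{2}\ln\!\left(2\pi e \sigma^2\right), \\
\sigma_1(X) &= \frac{1}{2\pi e}\, e^{2 h(X)} = \frac{1}{2\pi e}\cdot 2\pi e \sigma^2 = \sigma^2.
\end{align}
If $X$ and $Y$ are independent Gaussians, then $X + Y$ is Gaussian with variance $\sigma_X^2 + \sigma_Y^2$, so $\sigma_1(X+Y) = \sigma_X^2 + \sigma_Y^2 = \sigma_1(X) + \sigma_1(Y)$, and the inequality holds with equality; in one dimension the proportional-covariance condition is satisfied automatically.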

Connections to other areas of mathematics

 * Generalizations
 * Young's inequality for convolutions, whose sharp form implies the entropy power inequality