Rényi entropy

Definition
For a probability distribution $P$ on a finite or countable alphabet with point probabilities $p_{1},p_{2},\dots,p_{n}$, the Rényi entropy of order $\alpha\neq 0,1$ is defined as $H_{\alpha}(P)=\frac{1}{1-\alpha}\log\sum_{i} p_{i}^{\alpha}$. For $\alpha = -\infty,0,1,\infty$ the Rényi entropy is defined as the limit of Rényi entropies of orders for which the above formula applies. For $\alpha = 0$ the limit is taken from above, and $H_{0}$ equals the Hartley entropy, i.e. the logarithm of the number of elements in the support of $P$. For $\alpha = 1$ the limit is taken from below, and $H_{1}$ equals the Shannon entropy. For $\alpha = \infty$ one gets $H_{\infty}=-\log p_{\max}$, where $p_{\max}$ denotes the maximal point probability, so the Rényi entropy of order infinity equals the so-called min-entropy.
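The definition and its limiting cases can be sketched in a few lines of Python; the function name and the use of natural logarithms are choices made here for illustration:

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy (in nats) of a finite distribution p for order alpha >= 0."""
    p = [x for x in p if x > 0]          # restrict to the support of P
    if alpha == 0:                        # Hartley entropy: log of the support size
        return math.log(len(p))
    if alpha == 1:                        # Shannon entropy (the limit alpha -> 1)
        return -sum(x * math.log(x) for x in p)
    if alpha == math.inf:                 # min-entropy: -log of the largest probability
        return -math.log(max(p))
    return math.log(sum(x ** alpha for x in p)) / (1 - alpha)
```

For the uniform distribution on two points, every order gives the same value $\log 2$, e.g. `renyi_entropy([0.5, 0.5], 2)` returns `math.log(2)` up to rounding.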

Properties

 * The Rényi entropy equals zero if and only if the distribution is concentrated at a single point.
 * Rényi entropy is a non-increasing function of its order $\alpha$.
 * Rényi entropy is additive, i.e. $H_{\alpha}(P\times Q) = H_{\alpha}(P) + H_{\alpha}(Q)$, where $P\times Q$ denotes the product distribution.
 * Rényi entropy is a concave function of the distribution for $\alpha \in [0,1]$.

Differential Rényi entropy
For a real random variable with probability density $f$ the differential Rényi entropy of order $\alpha\neq 0,1$ is defined as $\frac{1}{1-\alpha}\log\int (f(x))^{\alpha}\,dx$. The formula for differential Rényi entropy is not always well-defined as an extended real number, but for most common distributions the integral is finite and can be calculated exactly.
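As a worked example, the differential Rényi entropy of a centered Gaussian $N(0,\sigma^{2})$ has the closed form $\frac{1}{2}\log(2\pi\sigma^{2}) + \frac{\log\alpha}{2(\alpha-1)}$, which can be checked against a direct numerical integration; the function names and the midpoint-rule quadrature below are choices made here for illustration:

```python
import math

def gaussian_pdf(x, sigma=1.0):
    return math.exp(-x * x / (2 * sigma * sigma)) / (sigma * math.sqrt(2 * math.pi))

def differential_renyi(pdf, alpha, lo=-10.0, hi=10.0, n=200_000):
    """Estimate (1/(1-alpha)) * log( integral of f(x)^alpha dx ) by the midpoint rule."""
    dx = (hi - lo) / n
    integral = sum(pdf(lo + (k + 0.5) * dx) ** alpha for k in range(n)) * dx
    return math.log(integral) / (1 - alpha)

alpha = 2.0
numeric = differential_renyi(gaussian_pdf, alpha)
exact = 0.5 * math.log(2 * math.pi) + math.log(alpha) / (2 * (alpha - 1))
assert abs(numeric - exact) < 1e-6
```

The closed form follows from evaluating the Gaussian integral $\int f^{\alpha}\,dx = (2\pi\sigma^{2})^{(1-\alpha)/2}\,\alpha^{-1/2}$ and taking the logarithm.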

History
Rényi entropy was introduced by the Hungarian mathematician Alfréd Rényi as a generalization of Shannon entropy. Of all the generalizations of Shannon entropy it is by far the most successful. Rényi had applications in probability theory in mind when he defined his new quantity, but the first applications were in the theory of random graphs. In recent years Rényi entropies have found many new applications in all branches of information theory. In physics many distributions with heavy tails are identified as maximizers of Rényi entropy. Physicists often use Tsallis entropy instead of Rényi entropy, but since Tsallis entropy is simply a monotone function of Rényi entropy, one arrives at the same distribution whichever of the two entropies is maximized.