Common information

Several information-theoretic quantifications of the correlation between two random variables have been referred to as common information.

Wyner's Common Information
\begin{equation} C_W(X;Y) \quad = \quad \min_{X-U-Y} I(X,Y;U), \end{equation}

where the minimization is over all auxiliary random variables $U$ such that $X-U-Y$ forms a Markov chain.
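
As a quick check on the definition, consider two extreme cases. If $X$ and $Y$ are independent, a constant $U$ satisfies the Markov condition and gives $C_W(X;Y) = 0$. If $X = Y$, every admissible $U$ must carry all of $X$:

\begin{equation} I(X,X;U) \quad \geq \quad I(X;U) \quad \geq \quad I(X;X) \quad = \quad H(X), \end{equation}

where the middle inequality is the data processing inequality applied to $X-U-X$; the choice $U = X$ achieves this bound, so $C_W(X;X) = H(X)$.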

Gács-Körner Common Information
Gács /'ga:tʃ/ and Körner /'kɜr-nɜr/

\begin{equation} C_{GK}(X;Y) \quad = \quad \max_{f(X) = g(Y)} H(f(X)), \end{equation}

where the maximization is over all pairs of deterministic functions $f$ and $g$ such that $f(X) = g(Y)$ with probability one.
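
Two quick consequences of the definition: if every pair $(x,y)$ has positive probability (in particular, if $X$ and $Y$ are independent), the constraint $f(X) = g(Y)$ forces $f$ and $g$ to be constant, so $C_{GK}(X;Y) = 0$. For a worked example, let $X = (A,B)$ and $Y = (A,C)$ with $A$, $B$, $C$ mutually independent. Then $f(X) = g(Y) = A$ is feasible, and $A$ is precisely the maximal common function (the Gács-Körner common part), so

\begin{equation} C_{GK}(X;Y) \quad = \quad H(A). \end{equation}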

Properties
Wyner's common information and Gács-Körner common information play complementary roles in many settings. Their relationship to mutual information is expressed by the following chain of inequalities.

\begin{equation} 0 \quad \leq \quad C_{GK}(X;Y) \quad \leq \quad I(X;Y) \quad \leq \quad C_W(X;Y) \quad \leq \quad \min \{H(X),H(Y)\}. \end{equation}

Proof

 * 1) $0 \leq C_{GK}(X;Y)$

This follows from the non-negativity of entropy: $H(f(X)) \geq 0$ for every $f$.


 * 2) $C_{GK}(X;Y) \leq I(X;Y)$

Start by observing that $H(f(X)|g(Y)) = 0$ for all valid $f$ and $g$ in the maximization, since $f(X) = g(Y)$ with probability one. Therefore, $H(f(X)) = I(f(X);g(Y)) \leq I(X;Y)$ by the data processing inequality, as spelled out below.
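
Written out in full, with the two inequalities each an instance of data processing (since $f(X)$ is a function of $X$ and $g(Y)$ is a function of $Y$):

\begin{equation} H(f(X)) \quad = \quad H(f(X)) - H(f(X)|g(Y)) \quad = \quad I(f(X);g(Y)) \quad \leq \quad I(X;g(Y)) \quad \leq \quad I(X;Y). \end{equation}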


 * 3) $I(X;Y) \leq C_W(X;Y)$

For every $U$ such that $X-U-Y$ forms a Markov chain, $I(X,Y;U) \geq I(X;U) \geq I(X;Y)$, as spelled out below; since this holds for every admissible $U$, it holds in particular for the minimizing one.
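
Expanding with the chain rule makes each step explicit:

\begin{equation} I(X,Y;U) \quad = \quad I(X;U) + I(Y;U|X) \quad \geq \quad I(X;U) \quad \geq \quad I(X;Y), \end{equation}

where the first inequality uses the non-negativity of conditional mutual information and the second is the data processing inequality for the Markov chain $X-U-Y$. Taking the minimum over admissible $U$ yields $I(X;Y) \leq C_W(X;Y)$.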


 * 4) $C_W(X;Y) \leq \min \{H(X),H(Y)\}$

Choose $U = X$, which is admissible because $X-X-Y$ trivially forms a Markov chain. Then $I(X,Y;U) = I(X,Y;X) = H(X) - H(X|X,Y) = H(X)$, so $C_W(X;Y) \leq H(X)$. The same steps with $U = Y$ show that $C_W(X;Y) \leq H(Y)$.
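
As a numerical sanity check on the chain of inequalities, here is a short Python sketch (the function names are illustrative, not from any particular library). It computes $H(X)$, $H(Y)$, $I(X;Y)$, and $C_{GK}(X;Y)$ for a discrete joint distribution, using the standard characterization of the Gács-Körner quantity as the entropy of the common part, i.e., of the connected-component index of the bipartite graph that joins $x$ to $y$ whenever $p(x,y) > 0$. $C_W$ is omitted, since evaluating it requires a nontrivial optimization over auxiliary variables $U$.

import numpy as np
from itertools import product

def entropy(p):
    # Shannon entropy in bits of a probability vector; zero entries are skipped.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def gacs_korner(pxy):
    # C_GK as the entropy of the Gacs-Korner common part: the index of the
    # connected component of the bipartite graph joining x to y when p(x,y) > 0.
    nx, ny = pxy.shape
    parent = list(range(nx + ny))          # union-find over all x- and y-symbols
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a
    for x, y in product(range(nx), range(ny)):
        if pxy[x, y] > 0:
            parent[find(x)] = find(nx + y)
    comp_mass = {}                         # probability mass per component
    for x in range(nx):
        root = find(x)
        comp_mass[root] = comp_mass.get(root, 0.0) + float(pxy[x].sum())
    return entropy(list(comp_mass.values()))

# Example: X = (A,B), Y = (A,C) with A, B, C independent fair bits, so that
# the common part is A and C_GK = H(A) = 1 bit, while I(X;Y) = 1 bit as well.
pxy = np.zeros((4, 4))                     # symbol (a,b) is encoded as 2*a + b
for a, b, c in product((0, 1), repeat=3):
    pxy[2 * a + b, 2 * a + c] += 1 / 8

hx = entropy(pxy.sum(axis=1))
hy = entropy(pxy.sum(axis=0))
mi = hx + hy - entropy(pxy.ravel())        # I(X;Y) = H(X) + H(Y) - H(X,Y)
cgk = gacs_korner(pxy)
print(f"H(X)={hx:.3f}  H(Y)={hy:.3f}  I(X;Y)={mi:.3f}  C_GK={cgk:.3f}")
# Check 0 <= C_GK <= I(X;Y) <= min{H(X), H(Y)} (C_W sits between the last two).
assert 0 <= cgk <= mi + 1e-9 <= min(hx, hy) + 1e-9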

Gray-Wyner Network
Wyner's common information and Gács-Körner common information are opposite extremizations of the Gray-Wyner network. Consider two correlated sources that are jointly encoded and sent to two destinations, with one source required losslessly at each destination. The transmission is sent over three noiseless rate-limited channels. These channels are bundled together but split at some point to connect to the two separated destinations. At the point of the split, the content of the shared channel (rate $R_0$) is duplicated and delivered to both destinations, while the remaining two channels (rates $R_1$ and $R_2$) are private, each continuing on to a single destination.

Wyner's common information is the answer to the question: what is the minimum shared rate $R_0$ such that the total rate $R_0 + R_1 + R_2$ still equals the optimal joint description rate $H(X,Y)$? Gács-Körner common information answers the opposite question: what is the maximum shared rate $R_0$ such that each destination still receives its source at the minimum possible total rate, i.e., $R_0 + R_1 = H(X)$ and $R_0 + R_2 = H(Y)$?
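
For reference, the Gray-Wyner rate region makes both extremizations precise: a rate triple $(R_0, R_1, R_2)$ is achievable if and only if there is an auxiliary random variable $U$ with

\begin{equation} R_0 \quad \geq \quad I(X,Y;U), \qquad R_1 \quad \geq \quad H(X|U), \qquad R_2 \quad \geq \quad H(Y|U). \end{equation}

Summing the three constraints gives $R_0 + R_1 + R_2 \geq H(X,Y) + I(X;Y|U)$, so sum-rate optimality forces $I(X;Y|U) = 0$, i.e., the Markov chain $X-U-Y$, and minimizing $R_0$ over such $U$ recovers $C_W(X;Y)$; the corresponding maximization of $R_0$ under the per-destination optimality constraints recovers $C_{GK}(X;Y)$.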