Distributed source coding

Distributed source coding (DSC) is an important problem in communications and multiterminal information theory. It concerns the compression of multiple correlated information sources that do not communicate with each other: each encoder compresses its source independently, while a single decoder jointly decodes all of the compressed streams.

Lossless distributed source coding
In lossless distributed source coding the decoder must reconstruct all of the encoded sources exactly, with a probability of error that vanishes as the block length grows.

In 1973, David Slepian and Jack Wolf established a surprising result for the independent encoding and joint decoding of two jointly distributed sources: the minimum sum rate at which the two separate encoders can losslessly compress the data is the same as when the two sources are encoded jointly, namely the joint entropy $H(X_1,X_2)$.

Slepian-Wolf Theorem

Let $(X_1,X_2)$ be a pair of finite-alphabet i.i.d. sources with joint probability distribution $p_{X_1,X_2}(x_1,x_2)$. A rate pair $(R_1,R_2)$ is achievable if and only if \begin{align} R_1 &> H(X_1|X_2) \\ R_2 &> H(X_2|X_1) \\ R_1+R_2 &> H(X_1,X_2). \end{align}
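The three conditions of the theorem are straightforward to check numerically. The sketch below computes the conditional and joint entropies from a joint pmf via the chain rule and tests whether a given rate pair lies inside the Slepian-Wolf region; the particular joint distribution is an illustrative assumption, not taken from the article.

```python
import numpy as np

# Hypothetical joint pmf of (X1, X2): rows index x1, columns index x2.
# This particular distribution is an illustrative choice.
p = np.array([[0.4, 0.1],
              [0.1, 0.4]])

def H(probs):
    """Shannon entropy in bits, ignoring zero entries."""
    probs = probs[probs > 0]
    return float(-np.sum(probs * np.log2(probs)))

H12 = H(p.ravel())        # joint entropy H(X1, X2)
H1 = H(p.sum(axis=1))     # marginal entropy H(X1)
H2 = H(p.sum(axis=0))     # marginal entropy H(X2)
H1_given_2 = H12 - H2     # chain rule: H(X1|X2) = H(X1,X2) - H(X2)
H2_given_1 = H12 - H1     # chain rule: H(X2|X1) = H(X1,X2) - H(X1)

def achievable(R1, R2):
    """Slepian-Wolf test: is the rate pair (R1, R2) achievable?"""
    return R1 > H1_given_2 and R2 > H2_given_1 and R1 + R2 > H12

print(achievable(H1, H2_given_1 + 0.01))  # corner point plus slack -> True
print(achievable(0.3, 0.3))               # sum rate below H(X1,X2) -> False
```

Note that the corner point $(H(X_1),\,H(X_2|X_1))$ already achieves the minimum sum rate $H(X_1,X_2)$, and rates between the two corner points trade off against each other along the line $R_1+R_2 = H(X_1,X_2)$.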

In 1975, Thomas Cover extended this result to an arbitrary number of jointly ergodic sources.

The Slepian-Wolf result demonstrates a fundamental property of distributed source coding systems: the joint dependence of the sources enables a tradeoff between the rates required from the different encoders. Other settings where this property has been observed include:

- Source coding with coded side information
- Gray-Wyner network
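A classical toy construction, common in the distributed source coding literature, makes the rate saving concrete: suppose $X$ and $Y$ are uniformly distributed 3-bit words that differ in at most one bit position, with $Y$ available only at the decoder. Instead of sending all 3 bits of $X$, the encoder sends only the 2-bit syndrome of $X$ under the (3,1) repetition code, i.e. the index of the coset containing $X$. The sketch below is an illustration of this standard binning idea, not a construction described in the article itself.

```python
import itertools
import numpy as np

# Parity-check matrix of the (3,1) repetition code {000, 111}.
# Its 4 cosets partition the 8 three-bit words into bins of size 2.
H = np.array([[1, 0, 1],
              [0, 1, 1]])

def syndrome(word):
    """Coset index of a 3-bit word (2 bits)."""
    return tuple(H @ word % 2)

# Each syndrome corresponds to a unique error pattern of weight <= 1.
patterns = [np.array(e) for e in itertools.product([0, 1], repeat=3)
            if sum(e) <= 1]
coset_leader = {syndrome(e): e for e in patterns}

def encode(x):
    """Encoder: transmit only the 2-bit coset index of x."""
    return syndrome(x)

def decode(s, y):
    """Decoder: recover x from the syndrome s and side information y.

    s XOR H@y equals H@(x XOR y), and x XOR y has weight <= 1 by
    assumption, so the coset-leader table identifies it uniquely.
    """
    e = coset_leader[tuple((np.array(s) + H @ y) % 2)]
    return (y + e) % 2

# Exhaustive check over every admissible (x, y) pair.
for x in itertools.product([0, 1], repeat=3):
    for y in itertools.product([0, 1], repeat=3):
        if sum(a != b for a, b in zip(x, y)) <= 1:
            assert np.array_equal(
                decode(encode(np.array(x)), np.array(y)), np.array(x))
print("all pairs recovered")
```

The encoder thus spends 2 bits rather than 3 even though it never sees $Y$; the correlation between the sources is exploited entirely at the decoder, in the spirit of the Slepian-Wolf theorem.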