Relay channel

The relay channel is an information-theoretic model for communication in which one or more relays assist in the transmission of a message from a sender to a receiver.

= Channel model =

Consider a three-node communication system. The sender (node 1) wants to transmit a message $M$ to the receiver (node 3) with the help of a relay (node 2).

Discrete memoryless relay channels
Consider a discrete memoryless relay channel model $(\mathcal{X}_1 \times \mathcal{X}_2, p(y_2,y_3|x_1,x_2), \mathcal{Y}_2 \times \mathcal{Y}_3)$, where $\mathcal{Y}_2$ is the output alphabet at the relay and $\mathcal{Y}_3$ the output alphabet at the receiver.

A $(2^{nR},n)$ code for the discrete memoryless relay channel consists of
 * a message set $[1:2^{nR}]$,
 * an encoder that assigns a codeword $x_1^n(m)$ to each message $m \in [1:2^{nR}]$,
 * a relay encoder that assigns at time $i \in [1:n]$ a symbol $x_{2i}(y_2^{i-1})$ to each past received sequence $y_2^{i-1} \in \mathcal{Y}_2^{i-1}$, and
 * a decoder that assigns a message $\hat{m}$ or an error message $e$ to each received sequence $y_3^n \in \mathcal{Y}_3^n$.

 * The channel is memoryless in the sense that $(X_1^{i-1},X_2^{i-1},Y_2^{i-1},Y_3^{i-1}) \rightarrow (X_{1i},X_{2i}) \rightarrow (Y_{2i}, Y_{3i})$ form a Markov chain.
 * The message $M$ is uniformly distributed over $[1:2^{nR}]$.
 * The average probability of error is defined as
\[ P_e^{(n)} = \mathsf{P}\{\hat{M} \neq M\}. \]
 * A rate $R$ is said to be achievable if there exists a sequence of $(2^{nR},n)$ codes with $P_e^{(n)} \rightarrow 0$ as $n \rightarrow \infty$.
 * The capacity $C$ is the supremum of all achievable rates.
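The definitions above can be sketched as a toy program. The channel law, codebook, and function names below are illustrative assumptions, chosen only to make one structural point concrete: the relay encoder is causal, so $x_{2i}$ may depend only on $y_2^{i-1}$.

```python
# Toy sketch of a (2^{nR}, n) code for a relay channel. The channel law,
# codebook, and function names are illustrative assumptions; the point is
# that the relay encoder is causal: x_{2i} depends only on y_2^{i-1}.

n = 8                             # blocklength
num_msgs = 4                      # 2^{nR} messages with R = 1/4

def x1_codeword(m):
    """Encoder: assigns a codeword x1^n(m) to each message m."""
    bits = [(m >> 1) & 1, m & 1]  # the 2 message bits, repeated over n uses
    return [bits[i % 2] for i in range(n)]

def x2_symbol(y2_past):
    """Relay encoder: x_{2i} may depend only on the PAST outputs y2^{i-1}."""
    return y2_past[-1] if y2_past else 0

def transmit(m):
    """Run the channel one use at a time; both links are noiseless toys."""
    x1, y2, y3 = x1_codeword(m), [], []
    for i in range(n):
        x2 = x2_symbol(y2)        # computed before y2[i] is observed
        y3.append(x2)             # toy relay->destination link: y3_i = x2_i
        y2.append(x1[i])          # toy source->relay link:      y2_i = x1_i
    return y3

def decode(y3):
    """Decoder: pick the message whose (one-use-delayed) codeword fits best."""
    return max(range(num_msgs),
               key=lambda m: sum(a == b
                                 for a, b in zip(x1_codeword(m), y3[1:])))
```

In this toy the destination hears only the relay; the relay's one-symbol delay is absorbed by matching $y_3^n$ against shifted codewords.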

Gaussian relay channels
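For the full-duplex Gaussian relay channel with unit-variance noises, two standard rate expressions can be evaluated numerically: the cut-set upper bound and the decode-and-forward lower bound (discussed below), each maximized over the source-relay correlation $\rho$. The channel gains and transmit powers in this sketch are illustrative assumptions.

```python
import math

# Numerical sketch of two standard rate expressions for the full-duplex
# Gaussian relay channel with unit-variance noises. Gains (g_*) and
# powers (Ps, Pr) are illustrative assumptions.

def cap(snr):
    """Gaussian capacity function C(x) = (1/2) log2(1 + x)."""
    return 0.5 * math.log2(1.0 + snr)

def cutset_bound(g_sr, g_sd, g_rd, Ps, Pr, steps=4000):
    """max over rho of min{ broadcast cut, multiple-access cut }."""
    best = 0.0
    for k in range(steps + 1):
        rho = k / steps
        bc = cap((1.0 - rho**2) * (g_sr + g_sd) * Ps)
        mac = cap(g_sd*Ps + g_rd*Pr + 2.0*rho*math.sqrt(g_sd*g_rd*Ps*Pr))
        best = max(best, min(bc, mac))
    return best

def df_rate(g_sr, g_sd, g_rd, Ps, Pr, steps=4000):
    """Decode-and-forward: the relay must decode, so only g_sr appears
    in the first term (compare with the broadcast cut above)."""
    best = 0.0
    for k in range(steps + 1):
        rho = k / steps
        dec = cap((1.0 - rho**2) * g_sr * Ps)
        mac = cap(g_sd*Ps + g_rd*Pr + 2.0*rho*math.sqrt(g_sd*g_rd*Ps*Pr))
        best = max(best, min(dec, mac))
    return best
```

When the source-relay link is strong (large `g_sr`), the two expressions approach each other, consistent with decode-and-forward being optimal for degraded relay channels.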
= Coding schemes =

The coding schemes for relay channels are cooperative strategies between the different links. A relay channel contains multiple links between the source and the destination, and a coding scheme specifies how these links are used and coordinated. Although the optimal scheme is still unknown, three popular coding schemes are illustrated below on the simplest relay channel: the three-node relay channel.

Amplify-and-forward
Amplify-and-forward (AF) is the simplest coding scheme for the relay channel: the relay does not decode at all, but simply scales its received signal to meet its power constraint and retransmits it. The destination thus receives a noisy copy of the source signal, in which the relay's own receiver noise has been amplified along with the signal.
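As a numerical sketch, a standard achievable rate for half-duplex AF with maximum-ratio combining at the destination can be computed as below; the SNR arguments are illustrative assumptions.

```python
import math

# Sketch of a standard achievable rate for half-duplex amplify-and-forward
# with maximum-ratio combining at the destination. The arguments are the
# receive SNRs of the three links (illustrative assumptions).

def af_rate(snr_sd, snr_sr, snr_rd):
    """(1/2) log2(1 + SNR_sd + SNR_sr*SNR_rd / (SNR_sr + SNR_rd + 1)).

    The 1/2 accounts for the two half-duplex phases. The relay-path term
    stays below min(SNR_sr, SNR_rd) because the relay amplifies its own
    receiver noise along with the signal.
    """
    relay_path = snr_sr * snr_rd / (snr_sr + snr_rd + 1.0)
    return 0.5 * math.log2(1.0 + snr_sd + relay_path)
```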

Decode-and-forward
In decode-and-forward (DF), the relay decodes the message from the source and forwards it. The destination recovers one part of the information (a list of possible messages) from the relay-destination link and the other part (another list of possible messages) from the source-destination link, and then intersects the two lists to determine the unique message. The constraint of DF is that the relay must fully decode the message in order to cooperate with the direct link, which introduces an 'unnecessary' rate constraint. DF achieves the capacity of the degraded relay channel. It is also interesting to note that DF achieves a rate higher than the capacity of the source-relay-destination link.
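The list-intersection step can be illustrated with a toy example. The binning scheme and the "direct link reveals the high bits" model below are assumptions made purely for illustration, not the actual random-coding construction.

```python
# Toy illustration of the list-intersection step in DF decoding. The
# binning and the "direct link reveals the high bits" model are
# illustrative assumptions, not the actual random-coding construction.

messages = set(range(16))          # 16 possible messages

def bin_of(m, num_bins=4):
    """Relay link conveys only a bin index (here: the low two bits)."""
    return m % num_bins

def direct_list(m):
    """Direct link narrows the message to a list (here: same high bits)."""
    return {x for x in messages if x >> 2 == m >> 2}

def df_decode(m):
    """Intersect the relay-link list with the direct-link list."""
    from_relay = {x for x in messages if bin_of(x) == bin_of(m)}
    return from_relay & direct_list(m)
```

Each link alone leaves four candidate messages; their intersection pins down the single transmitted message.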

Compress-and-forward
In compress-and-forward (CF), the relay does not decode the message. Instead, it compresses/quantizes its received signal and forwards the compression index. The destination first recovers the quantized relay signal, using its direct-link observation as side information, and then decodes the message from this recovered signal together with its own received signal. It is worth mentioning that CF achieves a rate higher than the capacity of the source-destination direct link. The limitation of CF is that when the relay compresses its received signal, it treats it as a naturally generated random signal and overlooks the structure of the source codebook. CF has been generalized to arbitrarily large networks as 'noisy network coding'.
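A commonly stated achievable-rate expression for CF on the full-duplex Gaussian relay channel, with Wyner-Ziv compression at the relay, can be sketched as follows; the SNR parameters are illustrative assumptions, and the exact form of the quantization-noise term is quoted from memory and should be checked against a reference before reuse.

```python
import math

# Hedged sketch of a commonly stated CF achievable rate for the full-duplex
# Gaussian relay channel with Wyner-Ziv compression at the relay.
# s_sd, s_sr, s_rd are receive SNRs (illustrative assumptions).

def cap(snr):
    """Gaussian capacity function C(x) = (1/2) log2(1 + x)."""
    return 0.5 * math.log2(1.0 + snr)

def cf_rate(s_sd, s_sr, s_rd):
    # Quantization-noise variance chosen so the compression index fits
    # through the relay->destination link (given direct-link side info).
    sigma_q2 = (1.0 + s_sr + s_sd) / s_rd
    # The destination effectively sees the direct signal plus a noisier
    # (quantized) copy of the relay's observation.
    return cap(s_sd + s_sr / (1.0 + sigma_q2))
```

Since the second term inside `cap` is strictly positive, `cf_rate` always exceeds the direct-link capacity `cap(s_sd)`, consistent with CF beating the direct link.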

= Code constructions =

Lattice coding for Gaussian relay channels
A lattice code is a linear code in Euclidean space. Lattice encoding and decoding can achieve the capacity of the AWGN point-to-point channel, the multiple-access channel, and the broadcast channel. For the classic AWGN three-node relay channel, lattice codes can achieve the rate regions of amplify-and-forward, decode-and-forward, and compress-and-forward. Lattice codes can outperform random codes in some scenarios because of their linearity: in Gaussian relay networks, the relay can choose to decode a linear combination of the received codewords rather than each one individually, which imposes fewer rate constraints. This is only possible with a linear code, such as a lattice code, rather than a random code. A good example illustrating the advantage of lattice codes is the full-duplex Gaussian two-way relay channel, in which two terminals want to transmit messages to each other through a relay. Using lattice codes, the relay can decode the sum of the two codewords without decoding them individually and broadcast it; with its side information (its own message), each terminal can then recover the desired message from the sum. This example shows that lattice codes are promising replacements for random codes in Gaussian networks, but whether lattice codes can achieve every rate region achievable by random codes is still an open question.
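The two-way relay example can be caricatured with noiseless modular arithmetic, an assumed simplification standing in for nested-lattice encoding and decoding.

```python
# Toy analogue of the two-way relay example: noiseless modular arithmetic
# stands in for nested-lattice operations (an assumed simplification).
# The relay decodes only the SUM of the two codewords, never each one.

q = 7                                # symbols live in Z_q

def relay_sum(x1, x2):
    """Relay decodes and broadcasts (x1 + x2) mod q, not x1 or x2 alone."""
    return (x1 + x2) % q

def recover(own, heard_sum):
    """A terminal subtracts its own symbol (side information) from the sum."""
    return (heard_sum - own) % q
```

The relay's broadcast carries the rate of one codeword rather than two, which is precisely the gain that linearity buys in this model.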

LDPC codes
= History =