Interactive communication

Introduction
Interactive communication is a subtopic of source coding with side information. Both concern the communication of information from a sender to a receiver who has related data; interactive communication studies the effect of interaction on the number of bits required.

More formally, a random-variable pair $(X,Y)$ is distributed according to a known probability distribution $p(x,y)$ over a support set $S\subseteq\mathcal{X}\times\mathcal{Y}$. A sender knows $X$ and wants to communicate its value to a receiver who knows $Y$. How many bits must be transmitted?

This outline focuses on the worst-case number of bits; for the expected number of bits, see the average-case outline mentioned below.

This outline describes some of the concepts, results, and problems concerning worst-case transmission for general communication problems. Among them:


 * The savings afforded by interaction are bounded.
 * One-way communication may be exponentially wasteful.
 * Two messages require at most four times the optimal number of bits.
 * It is not known whether a fixed number of messages is always optimal.
 * Additional outlines describing balanced distributions and average-case communication, where stronger results can be proven, will be added soon.

Interactive communication is best introduced via examples, such as the following.

League
Example (League) Informal description: there are $n$ teams in the $n$-BA. Two of them played a game; Bob knows which two teams played, and Alice knows which team won. For example, on the evening news Alice hears that the Lakers won, but not whom they beat, while Bob learns that the Celtics and the Lakers played, but not who won. As a result, Bob knows the two teams that played (Celtics and Lakers), but not the winner (Lakers); Alice knows the winning team, but not who they played against.

How many bits of information must Alice and Bob exchange in the worst case for Bob to find out who won?

Remark: Two slight variations are easy to analyze. If Bob did not know who played, Alice would need to transmit $\log n$ bits describing the winning team. (If you're not convinced, or unsure about the notation, see the basic introduction.) If, in addition to the winner, Alice also knew the losing team, she could transmit only one bit indicating whether the winning team is lexicographically lower or (as is the case here) higher. Essentially, we would like to know whether our scenario requires closer to the $\log n$ bits of the first variation, or the 1 bit of the second.

Note that in this example, as in all others considered in this overview, a sender (Alice) wants to convey some information (the winner) to a receiver (Bob) who has related data (the two teams that played).

One-way communication
We first consider the simplest case, where only Alice can transmit. We show that in this one-way communication scenario she must send exactly $\log n$ bits in the worst case.

$\log n$ bits suffice: Even without using Bob's information, Alice and Bob can agree in advance on a $\log n$-bit indexing (say lexicographic) of the teams, and Alice uses $\log n$ bits to transmit the winner's index.
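As a minimal sketch of this one-way protocol (function names and the 0-based indexing are illustrative, not from the text):

```python
import math

def encode_winner(winner_index, n):
    # Alice: transmit the winner's index using a fixed log n bits,
    # agreed upon in advance (here, indices 0..n-1 in binary)
    bits = math.ceil(math.log2(n))
    return format(winner_index, f"0{bits}b")

def decode_winner(message):
    # Bob: recover the winner's index from the message alone
    return int(message, 2)

# n = 16 teams: every winner costs log 16 = 4 bits
assert len(encode_winner(5, 16)) == 4
assert decode_winner(encode_winner(5, 16)) == 5
```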

$\log n$ bits are necessary: Recall that we are considering worst-case complexity. Alice's message to Bob is determined by the winner, for that is all she knows. For example, she may transmit 011 if she hears that the Celtics won and 1100 if she hears that the Pistons won. If for two different teams, for example the Celtics and the Lakers, she transmits the same message, say 011, then in the event that these two teams played each other, Bob will not know who won. It follows that Alice must transmit a different message for every team. Note also that if one message is a prefix of another, then if the corresponding teams play, Bob may not know when a message ends. Hence no message can be a prefix of another. There are therefore $n$ different messages, none a prefix of another, and at least one of them requires $\log n$ bits.

From one-way to interaction
The preceding analysis assumed that only Alice can transmit, so, for example, if two teams are assigned the same message, Bob can't indicate that he's missing information.

What if interaction is allowed? Can the total number of bits be reduced?

Remark: Recall that (1) we count the bits transmitted by both Alice and Bob; (2) we are interested in the worst-case number of bits; (3) Alice doesn't need to learn anything. So it's not a priori clear whether interaction can reduce communication.

Two messages
Interaction can reduce communication significantly. We first describe a simple interactive protocol where Alice and Bob exchange just two messages and transmit a total of $\log\log n+1$ bits, considerably fewer than the $\log n$ bits required with one-way communication.

Alice and Bob agree in advance on a $\log n$-bit representation of each of the $n$ teams. Bob, who knows the two teams that played, considers their binary representations. He transmits to Alice the location of the first bit where the two teams differ. Alice replies with the winner's bit in that location.

Example: Suppose that $n=16$, so that each team is described by 4 bits, and that the binary representations of the Celtics and the Lakers are 0110 and 1010 respectively. Bob uses two bits to transmit the number 3, indicating that 0110 and 1010 first differ in the third (from right) bit location, and Alice replies 0 because the third bit of the Lakers' representation is 0. Bob can then tell that the Lakers won because the Celtics' third bit is 1. To count the number of bits, observe that Bob describes a number between 1 and $\log n$, hence transmits $\log\log n$ bits. Alice transmits 1 bit, for a total of $\log\log n+1$ bits.
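The two-message protocol can be simulated in a few lines. This is an illustrative sketch (the function name and the use of 0-indexed bit positions, rather than the text's "third from the right", are my own):

```python
import math

def two_message_protocol(winner, game, n):
    """Simulate the two-message League protocol.

    Bob knows `game` (a pair of team indices); Alice knows `winner`
    (one of the two). Each team index fits in log n bits.
    """
    bits = math.ceil(math.log2(n))
    a, b = game
    # Bob's message: the first (lowest-order) bit position where the
    # two teams' representations differ (pos=2 means "third from right")
    pos = next(i for i in range(bits) if (a >> i) & 1 != (b >> i) & 1)
    # Alice's message: the winner's bit at that position
    alice_bit = (winner >> pos) & 1
    # Bob decodes: the team whose bit at `pos` matches Alice's reply
    return a if (a >> pos) & 1 == alice_bit else b

# Celtics = 0b0110, Lakers = 0b1010; the Lakers won
assert two_message_protocol(0b1010, (0b0110, 0b1010), 16) == 0b1010
```

Bob's message costs $\lceil\log\log n\rceil$ bits (the value of `pos`), and Alice's reply costs one, matching the count above.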

Remark: Curiously, Bob, who doesn't need to convey any information, ended up transmitting most of the bits. Alice, the only one who wants to convey information, transmitted just one bit.

Any better?
Is that the best we can do? Is there a protocol that requires fewer bits in the worst case?

Remark: To reduce the number of bits, we can consider either another two-message protocol, or one that requires more messages but fewer bits. This question is as easily answered for League as it is in the general communication setup, which we describe next. We use a minimal amount of "jargon", so please bear with it.

The general communication scenario
In the most general communication problem, a sender knows a random value $X$ while a receiver knows a random value $Y$ and wants to determine $X$. We are interested in the number of bits required in the worst case to ensure that the receiver can always determine $X$ correctly.

We assume that $X$ and $Y$ are distributed according to a known probability distribution $p(x,y)$ and that the sender and the receiver agree in advance on a protocol that determines who transmits what, based on the values they hold and previous transmissions.

One can consider various scenarios depending on the number of messages the communicators can exchange. We let $C_m$ denote the least total number of bits transmitted by the two communicators in the worst case when they are allowed to exchange at most $m$ messages. $C_1$ is therefore the number of bits required with one message, namely when only the sender can transmit, and $C_2$ is the total number of bits required with at most two messages: the receiver transmits, and the sender replies.

Allowing additional messages can only reduce the number of bits (we allow, not require, more messages), hence $C_1, C_2, C_3, \ldots$ is a non-increasing sequence of nonnegative integers and therefore converges to a limit $C_\infty$, the number of bits that must be transmitted in the worst case when there are no restrictions on the number of messages exchanged.

We would like to determine the $C_m$'s for specific problems, and the largest possible discrepancies between them for arbitrary problems.

Remark: It is easy to see that since we consider the worst-case number of bits and allow no errors, the exact probability distribution is irrelevant. The $C_m$'s are determined by the support set of $p$, the set of $(x,y)$ values that can occur.

Example: For League, $Y$ is a game, namely a set of two distinct integers between 1 and $n$, and $X$ is a winning team, namely one of the two integers in $Y$. We saw that $C_1=\log n$ and $C_2=\log\log n+1$. To determine these numbers, we didn't use the probabilities of the various games and winners. All we cared about was that every two teams could play and that the winner was one of the two teams.

Results
The following segments describe general results that hold for all communication problems. In particular, they apply to League, for which we still want to determine $C_2$ and $C_\infty$.

The limits of interaction
The League example shows that interaction can significantly reduce communication below the one-way number of bits. Can interaction achieve arbitrary savings? Is there, for example, a communication problem that requires a trillion bits one-way but only two bits with interaction?

No. The maximum reduction via interaction is logarithmic. For all communication problems,

$$ C_\infty\ge \log C_1+1. \qquad (1) $$

Remark: This bound holds only for the worst-case number of bits. For the average case there can be an arbitrary improvement, even between one and two messages (see average-case results, coming soon).

Application to League
Bound (1) can be used to completely determine the number of bits needed for the League example. We showed that for League, $C_1 = \log n$. Therefore,

$$ C_2\ge C_3 \ge C_\infty \ge \log C_1+1 = \log\log n+1. $$

But we also saw that for League

$$ C_2 \le \log\log n+1. $$

Hence all the inequalities above hold with equality, and for League

$$ C_2=C_3=\cdots=C_\infty=\lceil\log\log n\rceil+1. $$

This shows that the two-message protocol we described is optimal. There is no better League protocol, regardless of the number of messages.
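To see the gap concretely, here is a small sketch (function names are mine; ceilings included) comparing $C_1=\lceil\log n\rceil$ with $C_\infty=\lceil\log\log n\rceil+1$ for League:

```python
import math

def one_way(n):
    # C1 = ceil(log n): Alice indexes the winner directly
    return math.ceil(math.log2(n))

def interactive(n):
    # C_inf = ceil(log log n) + 1: Bob names a differing bit
    # location, Alice answers with one bit
    return math.ceil(math.log2(math.log2(n))) + 1

for n in (16, 2**10, 2**20):
    print(f"n={n}: one-way {one_way(n)} bits, interactive {interactive(n)} bits")
```

For $n=2^{20}$ this gives 20 bits one-way versus 6 bits interactively, the logarithmic reduction promised by Bound (1).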

One message may be exponentially wasteful
Bound (1) cannot be strengthened. It is achieved by all communication problems where (like League) there are two possible $X$ values for every $Y$ value. For all these communication problems,

$$ C_\infty=\log C_1+1. \qquad (2) $$

How should Equation (2) be interpreted? If multiple messages are acceptable, it is a positive result, saying that interaction can save a lot. However if, as in many scenarios, fewer messages are preferable, then, rewriting Equation (2) as

$$ C_1=2^{C_\infty - 1} $$

(modified a bit if $C_1$ is not a power of 2), it is a negative result stating that in some problems a single message requires exponentially more bits than the minimum necessary. It is natural to ask whether a small number (albeit $>1$) of messages suffices to reduce communication to almost the best possible.

Two messages are almost optimal
Perhaps the most interesting result applying to all communication problems is that just two messages always suffice to reduce communication to at most four times the optimal. For all communication problems,

$$ C_2 \le 4C_\infty+2. \qquad (3) $$

Recall that for some problems (like League) one message is exponentially wasteful. Bound (3) says that for all communication problems, two messages are almost optimal. Hence while there may be a large incentive to move from one to two messages, moving from two to more messages has a limited benefit.

In certain cases, Bound (3) also points us to efficient two-message protocols where they are not a priori obvious. This is illustrated by the following example.

Playoffs
As if one league wasn't enough, imagine a tournament with $l$ leagues, each with $n$ teams. As in the (soccer) World Cup, two teams from each league proceed to the playoffs, and one of these $2l$ teams wins the playoffs.

Bob knows the $2l$ teams that proceeded to the playoffs. Alice knows the winning team and wants to convey that information to Bob.

If only one message is allowed, Alice must completely describe the winning team. If the message she assigns to one team is the same as, or a prefix of, a message she assigns to another team, then if both teams participate in the playoffs, Bob will not know who won. Hence each team must be assigned a different message from a prefix-free set. Therefore,

$$ C_1= \log (nl). $$

It is also easy to find an efficient three-message protocol. Alice can transmit $\log l$ bits describing the winning team's league. Thereafter, Alice and Bob can use the League problem's protocol: Bob considers the two teams that proceeded to the playoffs from that league and uses $\log\log n$ bits to describe a location where the two teams differ, and Alice transmits one bit describing the winning team's bit in that location. It follows that

$$ C_3 \le \log l + \log\log n +1. $$

Efficient two-message protocols are more elusive. Next we use Bound (3) to upper-bound $C_2$. Later on, we'll mention a lower bound.
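The three-message protocol described above can be sketched as a short simulation (names and encodings are illustrative, not from the text):

```python
import math

def playoffs_protocol(winner, playoff_teams, n):
    """Simulate the three-message Playoffs protocol.

    `playoff_teams` maps each league to the two team indices (0..n-1)
    that advanced from it; `winner` is a (league, team_index) pair
    known to Alice; Bob knows `playoff_teams`.
    """
    league, w = winner
    # Message 1 (Alice -> Bob): log l bits naming the winner's league
    a, b = playoff_teams[league]  # Bob now knows the two candidates
    # Message 2 (Bob -> Alice): log log n bits naming a bit location
    # where the two candidates' log n-bit representations differ
    bits = math.ceil(math.log2(n))
    pos = next(i for i in range(bits) if (a >> i) & 1 != (b >> i) & 1)
    # Message 3 (Alice -> Bob): the winner's bit at that location
    alice_bit = (w >> pos) & 1
    return (league, a if (a >> pos) & 1 == alice_bit else b)

# l = 2 leagues, n = 16 teams each; the winner is team 10 of league 0
teams = {0: (6, 10), 1: (3, 5)}
assert playoffs_protocol((0, 10), teams, 16) == (0, 10)
```

The three messages cost $\lceil\log l\rceil$, $\lceil\log\log n\rceil$, and 1 bit respectively, matching the bound on $C_3$.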

Application of Bound (3) to Playoffs
Consider the Playoffs problem when $l$ is constant and $n$ increases. We saw that

$$ C_1 = \log n +c_1 \quad\text{while}\quad C_3 \le \log\log n +c_2, $$

where $c_1 = \log l$ and $c_2 = \log l+1$. Hence three messages require significantly fewer bits than one message. A low-rate two-message protocol is not immediately obvious. Yet Bound (3) implies that

$$ C_2 \le 4C_\infty+2 \le 4\log\log n + 4c_2 + 2. $$

Hence an efficient two-message protocol exists.

Are two messages optimal?
Two messages are the first achieving near optimality. It is natural to wonder about their precise efficiency. Can Bound (3) be improved? Are two messages exactly optimal?

Two messages are not optimal
Two messages are not optimal. They may require twice the minimum number of bits.

More specifically, for every $\alpha$, however large, there is a communication problem such that $C_3\ge\alpha$ and

$$ C_2\ge 2C_3- o(C_3), $$

where $o(x)$ is a term that, when divided by $x$, tends to zero as $x$ goes to $\infty$. This discrepancy is manifested by the Playoffs problem.

Two messages are not optimal for Playoffs
We saw that for the Playoffs problem with $l$ leagues, each with $n$ teams,

$$ C_3 \le \log l + \log\log n +1. $$

Using graph-coloring arguments, it can be proven that $C_2\ge \log\min(n,2l^2)$. Therefore, as $l$ increases and $n=2l^2$,

$$ C_2\ge 2C_3- o(C_3), $$

showing that two messages may require twice the optimal number of bits.

Open problem 1: two messages
We saw that two messages are always within a factor of four of the optimal and sometimes require twice the optimal. What is the largest possible discrepancy between $C_2$ and $C_\infty$?

Namely, what is the smallest constant $c$ such that for all communication problems, $C_2\le cC_\infty+o(C_\infty)$? Again, $o(x)$ is a term that, when divided by $x$, tends to zero as $x$ goes to $\infty$.

Open problem 2: multiple messages
One message can require exponentially more than the minimal number of bits. Two messages are always within a factor of four of the optimal, and sometimes require twice the optimal. The most interesting theoretical open problem is whether a fixed number $m$ of messages is always asymptotically optimal:

Is there an m such that for all communication problems,

$$ C_m\le C_\infty+ o(C_\infty)? $$

As usual, $o(x)$ is a term that, when divided by $x$, tends to zero as $x$ goes to $\infty$. An affirmative answer would mean that there is never a strong reason to exchange more than $m$ messages.

The answer to this question is not known, but the rest of the overview describes some partial results.

Three messages are not optimal
If an optimal number of messages exists, it is not three either. For every $\alpha$, however large, there is a communication problem such that $C_4\ge\alpha$ and

$$ C_3\ge 2C_4- o(C_4). \qquad (4) $$

Four messages are within a factor of three of optimal
The lowest number of messages that could potentially be optimal is four. But the only bound known is that four messages are within a factor of three of the optimal. For all communication problems,

$$ C_4 \le 3C_\infty +o(C_\infty). \qquad (5) $$

Summary
We showed that for the worst-case number of bits:

 * The savings afforded by interaction are bounded. For all communication problems, $C_\infty\ge\log C_1+1$.
 * One message may be exponentially wasteful. For some communication problems (League), $C_1=2^{C_\infty - 1}$.
 * Two messages are almost optimal. For all communication problems, $C_2\le 4C_\infty+2$.
 * Two messages are not optimal. For some communication problems (Playoffs), $C_2\ge 2C_3- o(C_3)$.
 * Three messages are not optimal either. For some communication problems, $C_3\ge 2C_4- o(C_4)$.
 * Four messages are within a factor of three of optimal. For all communication problems, $C_4\le 3C_\infty +o(C_\infty)$.

Among the more interesting open problems, we mentioned that it is not known:

 * What is the largest possible discrepancy between $C_2$ and $C_\infty$? Namely, what is the smallest constant $c$ such that for all communication problems, $C_2\le cC_\infty+o(C_\infty)$?
 * Whether a fixed number of messages is always optimal. Namely, is there an $m$ such that for all communication problems, $C_m\le C_\infty+o(C_\infty)$?

What's next?
To find out more about interactive communication, you can read:

 * Interactive communication of balanced distributions, which describes communication issues for problems that are more likely to arise in practice, for example when a sender wants to convey a file to a recipient who has a closely related file. It is shown that in all these cases the number of bits required is essentially the same as that needed when the sender knows the receiver's data in advance. (coming "soon")
 * An introduction to average-case interactive communication, which describes the results obtained when one considers the average number of bits. In particular, it is shown that there can be arbitrary savings due to interaction and that four messages are asymptotically optimal.

References
To better understand the material and see the proofs, you can read the following papers. Most of the results are described in:

 * Worst-case interactive communication I: Two messages are almost optimal, A. Orlitsky, IEEE Transactions on Information Theory, 36:5 (1990), pp. 1111-1126.
 * Worst-case interactive communication II: Two messages are not optimal, A. Orlitsky, IEEE Transactions on Information Theory, 37:4 (July 1991), pp. 995-1005.

Bound (4) is proven in:
 * Three messages are not optimal in worst-case interactive communication, Z. Zhang and X.G. Xia, IEEE Transactions on Information Theory, 40:1 (January 1994), pp. 3-11.

Bound (5) is proven in:
 * On interactive communication, R. Ahlswede, N. Cai and Z. Zhang, IEEE Transactions on Information Theory, 43:1 (January 1997), pp. 22-37.

Related topics, more germane to average-case complexity, are discussed in:
 * Three results on interactive communication, M. Naor, A. Orlitsky and P. Shor, IEEE Transactions on Information Theory, 39:5 (September 1993), pp. 1608-1615.
 * Interactive data compression, A. El Gamal and A. Orlitsky, Proceedings 25th IEEE Symposium on Foundations of Computer Science, October 1984, pp. 100-108.