Graph entropy

Graph entropy is a quantity associated with a graph whose vertex set is equipped with a probability distribution $P$.

Definition
There are several equivalent definitions of graph entropy. Let $G = (V,E)$ be a graph on a vertex set $V = \{1, 2, \ldots, k\}$. Let $P$ be a probability distribution on $\{1,2,\ldots,k\}$ with $P(i) = p_i$ for all $i$. Let $\mathcal{I}$ be the collection of independent sets of $G$.

Definition via mutual information
Definition. Let $X$ be a random variable with distribution $P$ and let $Y$ be a random variable taking values in $\mathcal{I}$ such that $\mathbb{P}(X \in Y) = 1$; that is, $Y$ is always an independent set containing $X$. Then the graph entropy is \begin{align} H(G,P) = \min_{X,Y} I(X;Y), \end{align} where $I(X;Y)$ denotes the mutual information and the minimum is taken over all joint distributions of $(X,Y)$ satisfying these properties.
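Two extreme cases illustrate this definition (informal checks, not part of a formal development). If $G$ has no edges, then $V \in \mathcal{I}$, so one may take $Y = V$ deterministically; this gives $I(X;Y) = 0$ and hence $H(G,P) = 0$. If $G = K_k$ is the complete graph, the only independent sets are singletons, which forces $Y = \{X\}$, so $H(K_k, P) = I(X;\{X\}) = H(P)$, the Shannon entropy of $P$.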

Definition via graph products and chromatic numbers
The $n$-th co-normal power $G^n$ of the graph $G = (V,E)$ has vertex and edge sets given by \begin{align} V(G^n) &= V^n, \\ E(G^n) &= \{ (x^n, y^n) : (x_i, y_i) \in E \text{ for some } i \}, \end{align} that is, two $n$-tuples are adjacent whenever they are adjacent in at least one coordinate. The chromatic number $\chi(G)$ of a graph is the minimum number of colors needed to color the vertices such that no two adjacent vertices have the same color.

Definition. Fix $\epsilon \in (0,1)$. Then the graph entropy is \begin{align} H(G,P) = \lim_{n \to \infty} \min_{U \subseteq V^n : P^n(U) > 1 - \epsilon} \frac{1}{n} \log \chi(G^n(U)), \end{align} where $P^n$ is the product distribution on $V^n$ and $G^n(U)$ is the subgraph of $G^n$ induced by the vertices in $U$.

That is, the graph entropy is the exponential growth rate of the number of colors needed to color high-probability subsets $U$ of the co-normal powers $G^n$. The fact that the definition yields the same value for every $\epsilon \in (0,1)$ is not obvious and requires proof.
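As an informal sanity check: for $G = K_2$ with uniform $P$, any two distinct $n$-tuples in $V^n$ differ in some coordinate, so the co-normal power $K_2^n$ is the complete graph on $2^n$ vertices and $\chi(K_2^n) = 2^n$. Taking $U = V^n$ gives $\frac{1}{n} \log \chi(K_2^n) = \log 2$, consistent with $H(K_2, P) = H(P) = \log 2$ for the uniform distribution.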

Definition via vertex packings
Consider the space $\mathbb{R}^{k}$, and for each independent set $A \in \mathcal{I}$ let $\mathbf{e}(A)$ be the vector with $e_i(A) = 1$ for $i \in A$ and $0$ elsewhere. Let $K$ be the convex hull of $\{\mathbf{e}(A) : A \in \mathcal{I}\}$. The set $K$ is called the vertex packing polytope of $G$.

Definition. Then the graph entropy is \begin{align} H(G,P) = \min_{\mathbf{a} \in K} \sum_{i=1}^{k} p_i \log \frac{1}{a_i}. \end{align}
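The vertex-packing formulation is a convex minimization, so for small graphs it can be approximated numerically. Below is a minimal pure-Python sketch (the function names and the descent parameters are illustrative, not from any standard library): it enumerates the independent sets, parametrizes points $\mathbf{a} \in K$ as convex combinations of their indicator vectors, and minimizes $\sum_i p_i \log(1/a_i)$ by exponentiated-gradient descent on the simplex of weights.

```python
import itertools
import math

def independent_sets(n, edges):
    """All nonempty independent sets of a graph on vertices 0..n-1."""
    edge_set = {frozenset(e) for e in edges}
    sets = []
    for r in range(1, n + 1):
        for comb in itertools.combinations(range(n), r):
            if all(frozenset(pair) not in edge_set
                   for pair in itertools.combinations(comb, 2)):
                sets.append(set(comb))
    return sets

def graph_entropy(n, edges, p, steps=4000, eta=0.1):
    """Approximate H(G,P) = min_{a in K} sum_i p_i log(1/a_i), with K the
    vertex packing polytope, by exponentiated-gradient descent on the
    convex weights over independent-set indicator vectors."""
    ind = independent_sets(n, edges)
    m = len(ind)
    w = [1.0 / m] * m  # uniform starting weights on the simplex
    for _ in range(steps):
        # Current point a in K: a_i = total weight of sets containing i.
        a = [sum(w[j] for j in range(m) if i in ind[j]) for i in range(n)]
        # Gradient of the objective with respect to each weight w_j.
        g = [-sum(p[i] / a[i] for i in A) for A in ind]
        # Multiplicative (mirror-descent) update, then renormalize.
        w = [wj * math.exp(-eta * gj) for wj, gj in zip(w, g)]
        s = sum(w)
        w = [wj / s for wj in w]
    a = [sum(w[j] for j in range(m) if i in ind[j]) for i in range(n)]
    return sum(p[i] * math.log(1.0 / a[i]) for i in range(n))

# Complete graph K_3 with uniform P: only singletons are independent,
# so the minimum is log 3, matching H(K_k, P) = H(P) for complete graphs.
print(graph_entropy(3, [(0, 1), (0, 2), (1, 2)], [1/3, 1/3, 1/3]))
# ≈ 1.0986 (= log 3)
```

For an edgeless graph the optimizer drives all weight onto the full vertex set, so $\mathbf{a} \to (1, \ldots, 1)$ and the value tends to $0$, as expected.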

Applications

 * Zero-error capacity
 * Computational complexity