Conditional entropy

Let $X|\{Y=y\} \sim P_{X|Y}(x|y)$, where $x\in A$ and $y \in B$. The conditional entropy of $X$ given the event $\{Y=y\}$ is defined by \[ H(X|Y=y) = \sum_{x\in A}P_{X|Y}(x|y)\log\frac{1}{P_{X|Y}(x|y)}. \]
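As a quick numerical check, here is a minimal Python sketch that evaluates this sum for an illustrative conditional pmf. The `entropy` helper and the probability values are assumptions for the example (base-2 logarithms, so the result is in bits), not part of the definition.

```python
import math

def entropy(pmf):
    """Shannon entropy in bits of a pmf given as a list of probabilities."""
    return sum(p * math.log2(1.0 / p) for p in pmf if p > 0)

# Illustrative conditional pmf P_{X|Y}(. | y) for one fixed event {Y = y}.
print(entropy([0.5, 0.25, 0.25]))  # H(X|Y=y) = 1.5 bits
```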

Let $(X,Y) \sim P_{X,Y}(x,y)$. The conditional entropy of $X$ given $Y$ is defined by \[H(X|Y) = \sum_{y \in B} H(X|Y=y)P_Y(y) = \sum_{x\in A,y\in B} P_{X,Y}(x,y)\log\frac{1}{P_{X|Y}(x|y)}, \] where the second equality follows from $P_{X,Y}(x,y) = P_Y(y)P_{X|Y}(x|y)$.
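The sketch below computes $H(X|Y)$ via the double sum over an illustrative joint pmf; the dictionary `p_xy` and its values are made up for the example, and base-2 logarithms are assumed.

```python
import math

# Illustrative joint pmf P_{X,Y}(x, y) on A = B = {0, 1}; values are assumptions.
p_xy = {(0, 0): 0.25, (1, 0): 0.25, (0, 1): 0.5, (1, 1): 0.0}

# Marginal P_Y(y) = sum over x of P_{X,Y}(x, y).
p_y = {}
for (x, y), p in p_xy.items():
    p_y[y] = p_y.get(y, 0.0) + p

# H(X|Y) as the double sum, using P_{X|Y}(x|y) = P_{X,Y}(x,y) / P_Y(y),
# so log(1 / P_{X|Y}(x|y)) = log(P_Y(y) / P_{X,Y}(x,y)).
h = sum(p * math.log2(p_y[y] / p) for (x, y), p in p_xy.items() if p > 0)
print(h)  # 0.5 bits: H(X|Y=0) = 1, H(X|Y=1) = 0, each weighted by P_Y(y) = 1/2
```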

Properties
 * Nonnegativity: \[ H(X|Y) \geq 0 \]
 * Conditioning reduces entropy on average: \[ H(X|Y) \leq H(X) \]
 * Conditioning on an event does not necessarily reduce the entropy (see the counterexample below): \[ H(X|Y=y) \nleq H(X) \]
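A standard counterexample, with numbers chosen for illustration: let $P(Y=0)=0.9$ with $X=0$ forced, and $P(Y=1)=0.1$ with $X$ uniform on $\{0,1\}$. Then $H(X)\approx 0.286$ bits, yet $H(X|Y=1)=1$ bit exceeds $H(X)$, while the average $H(X|Y)=0.1$ bit still satisfies $H(X|Y) \leq H(X)$. The sketch below verifies this numerically.

```python
import math

def entropy(pmf):
    """Shannon entropy in bits; zero-probability outcomes contribute nothing."""
    return sum(p * math.log2(1.0 / p) for p in pmf if p > 0)

# Marginal of X: P(X=0) = 0.9 + 0.1 * 0.5 = 0.95, P(X=1) = 0.05.
print(entropy([0.95, 0.05]))             # H(X)     ~= 0.286 bits
print(entropy([0.5, 0.5]))               # H(X|Y=1)  = 1.0 bit  > H(X)
print(0.9 * entropy([1.0]) + 0.1 * 1.0)  # H(X|Y)    = 0.1 bit  < H(X)
```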