Entropy rate

For a stochastic process $X=(X_i: i \in \mathbb{N})$, the entropy rate $H(X)$ of $X$ is defined to be $$ H(X)=\lim_{n \to \infty} H(X_{1}^n)/n, $$ when the limit exists. If $X$ is stationary, the limit does exist: since conditioning reduces entropy, $H(X_{n+1}|X_{1}^{n}) \leq H(X_{n+1}|X_{2}^{n}) = H(X_n|X_{1}^{n-1})$ (the equality by stationarity), so the sequence $H(X_n|X_{1}^{n-1})$ is non-increasing and converges; by the chain rule, $H(X_{1}^n)/n$ is the Cesàro average of these conditional entropies, and therefore $$ H(X) = \lim_{n \to \infty} H(X_n|X_{1}^{n-1}). $$
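As a numerical illustration (not from the text), the per-symbol block entropies $H(X_1^n)/n$ can be computed by brute-force enumeration for a small stationary process; the two-state Markov chain below, with transition matrix $P$ and stationary distribution $\pi$, is an assumed example:

```python
import itertools
import math

# Assumed example chain (hypothetical, for illustration only).
P = [[0.9, 0.1],
     [0.2, 0.8]]
# Stationary distribution solves pi P = pi; here pi = (2/3, 1/3).
pi = [2/3, 1/3]

def block_entropy(n):
    """H(X_1^n) in bits, by enumerating all length-n sample paths."""
    h = 0.0
    for path in itertools.product([0, 1], repeat=n):
        p = pi[path[0]]
        for a, b in zip(path, path[1:]):
            p *= P[a][b]
        if p > 0:
            h -= p * math.log2(p)
    return h

# H(X_1^n)/n decreases toward the entropy rate as n grows.
for n in (1, 2, 4, 8, 12):
    print(n, block_entropy(n) / n)
```

Enumeration is exponential in $n$, so this only serves to make the limit concrete on a toy chain.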

For a stationary Markov process $X$, the Markov property together with stationarity gives $$ H(X)=H(X_2|X_1). $$ If, furthermore, $X$ is an independent (hence i.i.d.) process, the entropy rate reduces to $$ H(X)=H(X_1). $$ The Shannon–McMillan–Breiman theorem is a fundamental theorem in information theory related to the entropy rate of a stationary ergodic process: it states that $-\frac{1}{n}\log p(X_1^n)$ converges almost surely to $H(X)$.

Entropy rate of hidden Markov chains
A hidden Markov chain $Z$ is a process observed from an underlying stationary Markov chain $X$, say $Z_i = \Phi(X_i)$ for a (possibly noisy) function $\Phi$. To date, no simple explicit formula is known for the entropy rate $H(Z)$ of a generic hidden Markov chain. Blackwell derived an integral formula for $H(Z)$ with a rather simple and explicit integrand; however, the measure with respect to which the integral is taken is typically too complicated for effective computation of $H(Z)$.

The Birch bounds give upper and lower bounds on the entropy rate $H(Z)$ of a hidden Markov chain $Z$ with underlying Markov chain $X$: for every $n$, $$ H(Z_n|Z_1^{n-1}, X_1) \leq H(Z) \leq H(Z_n|Z_1^{n-1}). $$ The upper bounds are non-increasing in $n$, the lower bounds are non-decreasing, and both converge to $H(Z)$ as $n \to \infty$.
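A sketch of the Birch bounds on a small assumed example (all parameters hypothetical): a two-state hidden chain observed through a binary symmetric channel with crossover probability $\varepsilon$. The bounds follow from block entropies via $H(Z_n|Z_1^{n-1}) = H(Z_1^n) - H(Z_1^{n-1})$ and $H(Z_n|Z_1^{n-1},X_1) = H(X_1, Z_1^n) - H(X_1, Z_1^{n-1})$:

```python
import itertools
import math

# Assumed example (hypothetical): hidden chain X with transition matrix P and
# stationary distribution pi; Z_i is X_i flipped with probability eps.
P = [[0.9, 0.1],
     [0.2, 0.8]]
pi = [2/3, 1/3]
eps = 0.1

def emit(x, z):
    """P(Z_i = z | X_i = x) for the binary symmetric channel."""
    return 1 - eps if x == z else eps

def joint_xz(n):
    """Distribution of (X_1, Z_1^n), summing over all hidden paths."""
    d = {}
    for xs in itertools.product([0, 1], repeat=n):
        px = pi[xs[0]]
        for a, b in zip(xs, xs[1:]):
            px *= P[a][b]
        for zs in itertools.product([0, 1], repeat=n):
            p = px
            for x, z in zip(xs, zs):
                p *= emit(x, z)
            key = (xs[0], zs)
            d[key] = d.get(key, 0.0) + p
    return d

def H(dist):
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def marginal_z(d):
    """Marginalize X_1 out, leaving the distribution of Z_1^n."""
    m = {}
    for (x1, zs), p in d.items():
        m[zs] = m.get(zs, 0.0) + p
    return m

# Lower and upper Birch bounds; they tighten around H(Z) as n grows.
for n in (2, 4, 6, 8):
    dn, dm = joint_xz(n), joint_xz(n - 1)
    lower = H(dn) - H(dm)                          # H(Z_n | Z_1^{n-1}, X_1)
    upper = H(marginal_z(dn)) - H(marginal_z(dm))  # H(Z_n | Z_1^{n-1})
    print(n, lower, upper)
```

Brute-force enumeration keeps the sketch transparent; in practice the same quantities are computed with the forward algorithm to avoid the exponential cost.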