INFORMATION, STATISTICS, AND PRACTICAL ACHIEVEMENT OF SHANNON CAPACITY

Andrew Barron, Yale University

The interplay of information and statistics arises in several areas of activity, of which we review a selection:

1) Information quantities play fundamental roles in the understanding of probabilistic phenomena, ranging from probability exponents for laws of large numbers, to the monotonic approach to the limit in the central limit theorem, to the convergence of Markov chain distributions, to martingales.

2) Information-theoretic techniques provide fundamental limits in the characterization of statistical risk, including minimax risks of function estimation.

3) Statistical modeling principles are informed and analyzed by principles of data compression: characterization of likelihood penalties that are both information-theoretically and statistically valid, with implications for maximum likelihood, Bayes, and MDL methods.

4) Statistical principles provide the formulation and solution of channel communication problems: sparse superposition encoding and adaptive successive decoding by iterative term extraction in regression with random design, achieving communication at rates up to capacity for the additive white Gaussian noise channel subject to a power constraint (a schematic sketch follows this list).
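To make item 4 concrete, the following is a minimal NumPy sketch of a sparse superposition code over the AWGN channel with a simplified successive decoder that iteratively extracts terms from the residual of the regression fit. All specifics here (the section sizes L and M, the rate R, the power P, the equal power allocation across sections, and the threshold sqrt(2 log M)) are illustrative choices for this sketch, not values from the talk; the full scheme uses a variable power allocation across sections and further refinements to reach rates approaching capacity, whereas this simplified variant is only intended to work at rates well below capacity.

    import numpy as np

    rng = np.random.default_rng(1)

    # Illustrative parameters (chosen for this sketch, not taken from the talk).
    L, M = 32, 64                # L sections, M columns each; one column chosen per section
    R = 0.3                      # rate in bits per channel use (well below capacity here)
    n = int(L * np.log2(M) / R)  # code length implied by the rate
    P, sigma2 = 7.0, 1.0         # power constraint and noise variance
    capacity = 0.5 * np.log2(1 + P / sigma2)

    N = L * M
    X = rng.standard_normal((n, N))      # random design / dictionary
    coef = np.sqrt(P / L)                # equal power allocation across sections

    def encode(message):
        """message[l] in {0,...,M-1}; codeword = sqrt(P/L) * sum of the chosen columns."""
        chosen = np.arange(L) * M + message
        return coef * X[:, chosen].sum(axis=1)

    def decode(y, passes=8, threshold=None):
        """Successive decoding sketch: iteratively extract terms whose normalized
        correlation with the residual exceeds a threshold, subtract them, and
        repeat; undecided sections are settled by a final argmax."""
        if threshold is None:
            threshold = np.sqrt(2.0 * np.log(M))
        decoded = np.full(L, -1)
        residual = y.copy()
        for _ in range(passes):
            z = X.T @ residual / np.linalg.norm(residual)   # ~N(0,1) for wrong columns
            for l in np.where(decoded < 0)[0]:
                block = z[l * M:(l + 1) * M]
                j = int(np.argmax(block))
                if block[j] > threshold:
                    decoded[l] = j
                    residual = residual - coef * X[:, l * M + j]
        z = X.T @ residual
        for l in np.where(decoded < 0)[0]:    # settle leftovers by maximum correlation
            decoded[l] = int(np.argmax(z[l * M:(l + 1) * M]))
        return decoded

    msg = rng.integers(0, M, size=L)
    y = encode(msg) + np.sqrt(sigma2) * rng.standard_normal(n)
    est = decode(y)
    print(f"rate {R:.2f} b/use, capacity {capacity:.2f} b/use, "
          f"section errors {int((est != msg).sum())}/{L}")

Decoding here is exactly a sparse regression problem with random design: the message determines which L of the N columns enter the linear model, and each pass tests the columns of the remaining sections against the current residual. The print statement reports how many of the L sections were decoded incorrectly in one simulated transmission.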