Main Page



Welcome,

These are the modest beginnings of an ambitious project. We want to change how researchers distribute information, exchange ideas, and collaborate. To get started, here is a Wiki. We hope you will help us fill in content.

The aim is to cover technical topics at a level comparable to a tutorial or a basic book. The exact level of specificity may be hard to describe, but a good introductory graduate-level course you took is a useful benchmark. Please try to include examples. You can see some conventions we would like to follow. You can also see some convenient formatting features in the dummy page, and get up to speed on basic Wiki usage in the Wiki quickstart guide. If you have any suggestions, please add them to the suggestion page.

Information Theory
Information theory is a branch of electrical engineering, applied mathematics, and theoretical computer science concerned with developing a mathematical theory of information.

Information theory was developed by Claude E. Shannon to find fundamental limits on signal processing and communication operations such as compression, reliable communication, and storage of information. Since its inception it has broadened to find applications in many other areas, including statistical inference, cryptography, and networking.

The key measure of information is known as entropy, which is measured in bits. The theory's impact has been crucial to the success of the Voyager missions to deep space, the invention of the compact disc, the feasibility of mobile phones, the development of the Internet, the study of linguistics and of human perception, the understanding of black holes, and numerous other fields; see impact of information theory.
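To make the notion of entropy concrete, here is a minimal Python sketch (the function name `entropy` is ours) computing the Shannon entropy H(X) = -Σ p(x) log₂ p(x) of a discrete distribution:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H(X) = -sum of p * log2(p)."""
    # Terms with p = 0 contribute nothing (0 * log 0 is taken as 0).
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: exactly 1 bit per toss.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so each toss carries less information.
print(entropy([0.9, 0.1]))   # about 0.469 bits
```

A fair coin attains the maximum entropy for a binary source; any bias reduces the entropy, which is exactly why biased sources can be compressed below one bit per symbol.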

Important sub-fields of information theory are source coding, channel coding, algorithmic information theory, information-theoretic security, and measures of information.

Textbooks on Information Theory

 * Thomas M. Cover, Joy A. Thomas. Elements of Information Theory. 1st Edition, New York: Wiley-Interscience, 1991. ISBN 0-471-06259-6. 2nd Edition, New York: Wiley-Interscience, 2006. ISBN 0-471-24195-4.


 * Robert Gallager. Information Theory and Reliable Communication. New York: John Wiley and Sons, 1968. ISBN 0-471-29048-3


 * Imre Csiszar, Janos Korner. Information Theory: Coding Theorems for Discrete Memoryless Systems. 2nd Edition, Akademiai Kiado, 1997. ISBN 963-05-7440-3


 * Claude E. Shannon, Warren Weaver. The Mathematical Theory of Communication. Univ of Illinois Press, 1949. ISBN 0-252-72548-4


 * Robert B. Ash. Information Theory. New York: Interscience, 1965. ISBN 0-470-03445-9. New York: Dover, 1990. ISBN 0-486-66521-6


 * Raymond W. Yeung. Information Theory and Network Coding. Springer, 2008. ISBN 978-0-387-79233-0

Other books

 * Leon Brillouin, Science and Information Theory, Mineola, N.Y.: Dover, [1956, 1962] 2004. ISBN 0-486-43918-6

Topics

 * Basic notions
 * Source coding
 * Channel coding
 * Joint Source-Channel Coding
 * Data storage
 * Shannon theory
 * Coding theory
 * Multiterminal information theory


 * Compressive sensing
 * Quantum information theory

Connections to other fields

 * Information theory for wireless
 * Information theory and networks
 * Information theory and control
 * Information theory and learning
 * Information theory and statistics
 * Information theory and neuroscience


 * Graphical Models
 * Information theory and game theory
 * Information theory and combinatorics
 * Anonymity in networks
 * Denoising and Filtering


 * Information theory and molecular biology

Consult the User's Guide for information on using the wiki software.