Entropy and Information Theory


This book is devoted to the theory of probabilistic information measures and their application to coding theorems for information sources and noisy channels. The eventual goal is a general development of Shannon's mathematical theory of communication, but much of the space is devoted to the tools and methods required to prove the Shannon coding theorems. These tools form an area common to ergodic theory and information theory and comprise several quantitative notions of the information in random variables, random processes, and dynamical systems. Examples are entropy, mutual information, ...
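As a quick reference (not part of the listing itself), the standard definitions of the first two measures named above, for discrete random variables X and Y with probability mass functions p(x), p(y), and joint pmf p(x, y):

```latex
% Shannon entropy of a discrete random variable X
H(X) = -\sum_{x} p(x) \log p(x)

% Mutual information between X and Y
I(X;Y) = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)}
```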
