How Shannon Entropy Imposes Fundamental Limits on Communication

Kevin Hartnett in Quanta:

If someone tells you a fact you already know, they’ve essentially told you nothing at all. Whereas if they impart a secret, it’s fair to say something has really been communicated.

This distinction is at the heart of Claude Shannon’s theory of information. Introduced in an epochal 1948 paper, “A Mathematical Theory of Communication,” it provides a rigorous mathematical framework for quantifying the amount of information needed to accurately send and receive a message, as determined by the degree of uncertainty around what the intended message could be saying.

Which is to say, it’s time for an example.
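
The article's own worked example is at the link below. As a quick illustration of the idea in the excerpt, here is a minimal Python sketch (mine, not drawn from the piece) of the standard Shannon entropy formula, H(X) = -Σ p(x) log2 p(x): a message whose content is already certain carries zero bits, while an uncertain one carries more.

```python
# Minimal sketch of Shannon entropy (not the article's example):
# H(X) = -sum_x p(x) * log2(p(x)), the average uncertainty, in bits,
# about which message will actually be sent.

from math import log2

def entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A "fact you already know": one outcome with probability 1.
print(entropy([1.0]))        # 0.0 bits -- nothing new is communicated

# A fair coin flip: maximal uncertainty over two outcomes.
print(entropy([0.5, 0.5]))   # 1.0 bit

# A heavily biased coin: less uncertainty, so fewer bits per flip.
print(entropy([0.9, 0.1]))   # ~0.47 bits
```

The three calls mirror the opening distinction: the more predictable the message, the less information its arrival conveys.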

More here.