Foundations

Claude Shannon

1916–2001 · Information theory
[Figure: the binary entropy H(p) plotted against p]

A Mathematical Theory of Communication

Shannon's 1948 paper created information theory in a single stroke. He defined information mathematically as the reduction of uncertainty: a message is informative precisely to the extent that it is surprising. In the same paper he introduced the bit as the fundamental unit of information, proved the channel capacity theorem, and established the theoretical limits of lossless data compression.

Entropy and surprise

Shannon entropy measures the average surprise of a random variable: high entropy means high uncertainty, low entropy means predictability. In the context of financial disclosures, a filing section with high entropy contains genuinely novel information, while a low-entropy section is boilerplate. Shannon's framework lets us quantify precisely how much information a document actually contains.
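
To make the measure concrete, here is a minimal sketch of per-token entropy computed over a text's empirical word distribution. The function and the toy sentences are illustrative only, not production scoring code.

```python
import math
from collections import Counter

def shannon_entropy(tokens):
    """Average surprise, in bits per token: H = -sum p(x) * log2 p(x)."""
    counts = Counter(tokens)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A repetitive sentence versus one where every word carries new information.
boilerplate = "the company may face risks the company may face risks".split()
novel = "litigation settled covenant waived dividend suspended guidance withdrawn auditor resigned".split()

print(shannon_entropy(boilerplate))  # ~2.32 bits/token: heavy repetition
print(shannon_entropy(novel))        # ~3.32 bits/token: every token is new
```

Repetition collapses the distribution onto a few tokens and drives the entropy down; that drop is what "boilerplate" looks like numerically.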

From communication to finance

Shannon himself was an active and successful investor and understood the connection between information theory and markets. The Kelly criterion, the rule for sizing bets to maximize long-run growth, was derived by his Bell Labs colleague John Kelly as a direct application of Shannon's channel capacity theorem: the achievable growth rate of a gambler with a private signal equals the information rate of that signal. Markets are noisy channels; signals are messages; capital allocation is encoding. Information theory provides the mathematical framework for all three.
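
As a sketch of that connection, the snippet below (illustrative only, assuming repeated even-money bets on a binary signal) computes the Kelly stake and the resulting growth rate; for even odds the maximal growth rate works out to 1 - H(p), the bettor's information advantage in bits.

```python
import math

def kelly_fraction(p, b=1.0):
    """Kelly stake for a bet won with probability p, paying b-to-1."""
    return (b * p - (1 - p)) / b

def growth_rate(p, f, b=1.0):
    """Expected log2 growth of capital per bet when staking fraction f."""
    return p * math.log2(1 + f * b) + (1 - p) * math.log2(1 - f)

def binary_entropy(p):
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

p = 0.6                      # probability the signal is right (assumed)
f = kelly_fraction(p)        # 0.2: stake 20% of capital at even odds
g = growth_rate(p, f)        # ~0.029 bits of growth per bet
print(f, g, 1 - binary_entropy(p))  # for even odds, g equals 1 - H(p)
```

Betting more or less than the Kelly fraction lowers the long-run growth rate, which is why the criterion is stated as an optimum rather than a heuristic.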

Why this matters

Our information density scoring signal is a direct application of Shannon's entropy. We measure how much genuine information each section of a filing contains — distinguishing novel, actionable content from repetitive boilerplate. Shannon tells us exactly what to pay attention to.
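
The sketch below shows the general shape of such a signal: score each section of a filing by its per-token entropy and rank sections from most to least information-dense. The function names, thresholds, and sample sections are hypothetical and stand in for the actual pipeline.

```python
import math
from collections import Counter

def token_entropy(text):
    """Bits per token under the section's own empirical word distribution."""
    tokens = text.lower().split()
    counts = Counter(tokens)
    n = len(tokens)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def rank_sections(sections):
    """Order filing sections from most to least information-dense."""
    return sorted(sections, key=lambda s: token_entropy(s["text"]), reverse=True)

# Hypothetical filing sections, for illustration only.
filing = [
    {"name": "Risk Factors", "text": "we may face risks we may face risks we may face risks"},
    {"name": "MD&A", "text": "revenue fell nine percent after losing the largest customer and margins compressed"},
]
for section in rank_sections(filing):
    print(section["name"], round(token_entropy(section["text"]), 2))
```

A production version would normalize for section length and compare against a reference distribution built from prior filings, but the ranking principle is the same: high average surprise marks the sections worth reading first.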