Information Theory Basics

Information Content

$$ I(x) = -\log_2 p(x) $$

Self-information grows as an outcome's probability shrinks, so rare events carry more information: a fair coin flip ($p = 1/2$) carries exactly 1 bit, while a certain event ($p = 1$) carries none.
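
The definition translates directly into a minimal sketch (the function name `self_information` is illustrative, not from the text):

```python
import math

def self_information(p: float) -> float:
    """Self-information in bits: I(x) = -log2 p(x)."""
    if not 0.0 < p <= 1.0:
        raise ValueError("probability must be in (0, 1]")
    return -math.log2(p)

# Rarer outcomes carry more information.
print(self_information(0.5))    # fair coin flip -> 1.0 bit
print(self_information(1.0))    # certain event  -> 0.0 bits
print(self_information(0.001))  # rare event     -> ~9.97 bits
```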

Key Concepts

  • Entropy: the average information content of a source, $H(X) = -\sum_x p(x) \log_2 p(x)$ (see the sketch after this list)
  • Mutual Information: the information shared between two variables, $I(X;Y) = H(X) - H(X \mid Y)$
  • Channel Capacity: the maximum rate of reliable transmission over a noisy channel, $C = \max_{p(x)} I(X;Y)$
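
A minimal NumPy sketch of the first two concepts, assuming a small discrete joint distribution over the channel's input and output (the function names and the example table are illustrative):

```python
import numpy as np

def entropy(p: np.ndarray) -> float:
    """Shannon entropy in bits: H(X) = -sum p(x) log2 p(x)."""
    p = p[p > 0]  # treat 0 * log2(0) as 0
    return float(-np.sum(p * np.log2(p)))

def mutual_information(joint: np.ndarray) -> float:
    """I(X;Y) = H(X) + H(Y) - H(X,Y), from a joint distribution table."""
    px = joint.sum(axis=1)  # marginal over inputs
    py = joint.sum(axis=0)  # marginal over outputs
    return entropy(px) + entropy(py) - entropy(joint.ravel())

# Example: joint input/output distribution of a noisy binary channel.
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])
print(entropy(joint.sum(axis=1)))  # H(X) = 1.0 bit
print(mutual_information(joint))   # I(X;Y) ~= 0.278 bits
```

Channel capacity then maximizes $I(X;Y)$ over input distributions. The table above happens to describe a binary symmetric channel with crossover probability 0.2, whose capacity $1 - H(0.2) \approx 0.278$ bits is achieved by exactly this uniform input.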
