Information Theory Basics
Information Content
$$ I(x) = -\log_2 p(x) $$
Rare events carry more information.
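A minimal Python sketch of the formula above; the helper name `information_content` is illustrative, not from any particular library, and the probabilities are made up for the example.

```python
import math

def information_content(p: float) -> float:
    """Self-information in bits: I(x) = -log2 p(x)."""
    return -math.log2(p)

# A rare event (p = 0.01) carries far more information than a common one (p = 0.5).
print(information_content(0.5))   # 1.0 bit
print(information_content(0.01))  # ~6.64 bits
```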
Key Concepts
- Entropy: Average information content of a source (see the sketch after this list)
- Mutual Information: Shared information between variables
- Channel Capacity: Maximum reliable transmission rate
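A minimal Python sketch of the first two concepts, assuming finite discrete distributions given as probability lists; the function names `entropy` and `mutual_information` are illustrative and compute I(X;Y) via the identity H(X) + H(Y) - H(X,Y). Channel capacity is left to the dedicated snippet below.

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum p * log2 p (zero-probability outcomes ignored)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), with the joint distribution as a 2D list of probabilities."""
    px = [sum(row) for row in joint]               # marginal of X (rows)
    py = [sum(col) for col in zip(*joint)]         # marginal of Y (columns)
    pxy = [p for row in joint for p in row]        # flattened joint
    return entropy(px) + entropy(py) - entropy(pxy)

# A fair coin has 1 bit of entropy; a biased coin has less.
print(entropy([0.5, 0.5]))   # 1.0
print(entropy([0.9, 0.1]))   # ~0.469

# Perfectly correlated X and Y share 1 bit of mutual information.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))  # 1.0
```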
Further Reading
Related Snippets
- Channel Capacity: Shannon's theorem and noisy channels
- Data Compression: Lossy vs lossless compression, Huffman coding
- Entropy & Information Measures: Shannon entropy, cross-entropy, and KL divergence
- Mutual Information: Measuring dependence between variables