Information Theory Snippets
Foundational concepts and mathematical tools of information theory, including entropy, compression, and channel capacity.
Snippets
- Channel Capacity: Shannon's theorem and noisy channels
- Data Compression: Lossy vs. lossless compression, Huffman coding
- Entropy & Information Measures: Shannon entropy, cross-entropy, and KL divergence (see the sketch after this list)
- Information Theory Basics: Fundamental concepts of information theory
- Mutual Information: Measuring dependence between variables
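As a quick illustration of the entropy and divergence measures named above, here is a minimal Python sketch (not part of the snippets themselves; function names are illustrative) that computes Shannon entropy, cross-entropy, and KL divergence for small discrete distributions given as probability lists.

```python
import math

def shannon_entropy(p, base=2):
    """H(P) = -sum_i p_i log p_i, skipping zero-probability outcomes."""
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

def cross_entropy(p, q, base=2):
    """H(P, Q) = -sum_i p_i log q_i.

    Raises if Q assigns zero probability where P does not
    (mathematically the cross-entropy is infinite there).
    """
    return -sum(pi * math.log(qi, base) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q, base=2):
    """D_KL(P || Q) = H(P, Q) - H(P) = sum_i p_i log(p_i / q_i)."""
    return cross_entropy(p, q, base) - shannon_entropy(p, base)

# Example: a fair coin P versus a biased coin Q.
p = [0.5, 0.5]
q = [0.9, 0.1]
print(shannon_entropy(p))   # 1.0 bit
print(cross_entropy(p, q))  # ~1.737 bits
print(kl_divergence(p, q))  # ~0.737 bits
```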