Shannon-Hartley Theorem

Maximum rate of reliable communication over a noisy channel:

$$ C = B \log_2\left(1 + \frac{S}{N}\right) $$

Where:

- $C$ = channel capacity (bits/second)
- $B$ = bandwidth (Hz)
- $S/N$ = signal-to-noise ratio

Example

```python
import numpy as np

def channel_capacity(bandwidth, snr_db):
    """Shannon-Hartley capacity in bits/second, for an SNR given in dB."""
    snr_linear = 10 ** (snr_db / 10)  # convert dB to a linear power ratio
    return bandwidth * np.log2(1 + snr_linear)
```
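As a quick check of the function above, the classic voice-band telephone example (roughly 3 kHz of bandwidth at about 30 dB SNR) gives a capacity near 30 kbit/s; these particular numbers are illustrative, not taken from the original article:

```python
# 3000 * log2(1 + 10**(30/10)) = 3000 * log2(1001) ~ 29,900 bits/s
print(channel_capacity(3000, 30))
```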
Lossless vs Lossy

- Lossless: Perfect reconstruction (ZIP, PNG, FLAC)
- Lossy: Approximate reconstruction (JPEG, MP3, H.264)

Huffman Coding

Optimal prefix-free code for known symbol probabilities.

```python
import heapq
from collections import Counter

def huffman_encoding(data):
    """Build Huffman tree and encode the input data."""
    ...  # see the completed sketch below
```
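The preview cuts the function body off; below is a minimal sketch of one common heapq-based way to finish it. The exact behaviour of the original function is not shown, so returning the encoded bit string together with a {symbol: codeword} table is an assumption here.

```python
import heapq
from collections import Counter

def huffman_encoding(data):
    """Build a Huffman tree and encode the input data.

    Returns (encoded_bits, code_table); this return shape is an assumption,
    not taken from the original article.
    """
    # Heap entries: [total_frequency, [symbol, codeword], [symbol, codeword], ...]
    heap = [[freq, [sym, ""]] for sym, freq in Counter(data).items()]
    heapq.heapify(heap)
    if len(heap) == 1:                 # degenerate case: only one distinct symbol
        heap[0][1][1] = "0"
    while len(heap) > 1:
        lo = heapq.heappop(heap)       # two least-frequent subtrees
        hi = heapq.heappop(heap)
        for pair in lo[1:]:
            pair[1] = "0" + pair[1]    # left branch gets a leading 0
        for pair in hi[1:]:
            pair[1] = "1" + pair[1]    # right branch gets a leading 1
        heapq.heappush(heap, [lo[0] + hi[0]] + lo[1:] + hi[1:])
    code_table = dict(heapq.heappop(heap)[1:])
    return "".join(code_table[s] for s in data), code_table
```

For example, `huffman_encoding("abracadabra")` assigns the shortest codeword to `'a'`, the most frequent symbol.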
Shannon Entropy

Average information content:

$$ H(X) = -\sum_i p(x_i) \log_2 p(x_i) $$

Units: bits (if log base 2), nats (if natural log)

```python
import numpy as np

def entropy(probabilities):
    """Calculate Shannon entropy in bits."""
    p = np.array(probabilities)
    p = p[p > 0]  # remove zeros (0 log 0 is taken as 0)
    return -np.sum(p * np.log2(p))
```
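A quick usage check for the function above (the values follow directly from the formula and are not taken from the original article): a fair coin carries exactly one bit, a biased coin less.

```python
print(entropy([0.5, 0.5]))  # 1.0 bit: fair coin
print(entropy([0.9, 0.1]))  # ~0.469 bits: a biased coin is more predictable
```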
Information Content

$$ I(x) = -\log_2 p(x) $$

Rare events carry more information.

Key Concepts

- Entropy: Average information content
- Mutual Information: Shared information between variables
- Channel Capacity: Maximum reliable transmission rate

Further Reading

- Information Theory - Wikipedia
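To make the "rare events carry more information" point concrete, a small sketch (not from the original article):

```python
import numpy as np

def information_content(p):
    """Self-information I(x) = -log2 p(x), in bits."""
    return -np.log2(p)

print(information_content(0.5))    # 1.0 bit: an even-odds outcome
print(information_content(0.001))  # ~9.97 bits: a rare event is far more informative
```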
Foundational concepts and mathematical tools of information theory, including entropy, compression, and channel capacity.
Mutual Information

Definition

Measures how much knowing one variable reduces uncertainty about another:

$$ I(X;Y) = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)p(y)} $$

Properties

- $I(X;Y) = I(Y;X)$ (symmetric)
- $I(X;Y) \geq 0$ (non-negative)
- $I(X;X) = H(X)$ (self-information is entropy)
- $I(X;Y) = 0$ if $X$ and $Y$ are independent

Python Example
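The example itself is cut off in the preview; a minimal sketch that evaluates the definition directly from a joint probability table (the function and variable names are assumptions) could look like this:

```python
import numpy as np

def mutual_information(p_xy):
    """Mutual information I(X;Y) in bits from a joint probability matrix p_xy."""
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = p_xy.sum(axis=1, keepdims=True)  # marginal p(x), column vector
    p_y = p_xy.sum(axis=0, keepdims=True)  # marginal p(y), row vector
    mask = p_xy > 0                        # skip zero-probability cells
    return np.sum(p_xy[mask] * np.log2(p_xy[mask] / (p_x @ p_y)[mask]))

# Independent variables give I(X;Y) = 0; perfectly correlated ones give I(X;Y) = H(X).
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0
```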