# Channel Capacity

## Shannon-Hartley Theorem
The maximum rate at which information can be transmitted with arbitrarily low error over a band-limited channel with additive white Gaussian noise:
$$ C = B \log_2\left(1 + \frac{S}{N}\right) $$
Where:
- $C$ = channel capacity (bits/second)
- $B$ = bandwidth (Hz)
- $S/N$ = signal-to-noise ratio (as a linear power ratio, not dB)
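Plugging in concrete numbers: a 20 dB SNR is a linear ratio of $10^{20/10} = 100$, so a 1 MHz channel supports

$$ C = 10^6 \log_2(1 + 100) \approx 6.66\ \text{Mbps} $$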
## Example
```python
import numpy as np

def channel_capacity(bandwidth, snr_db):
    """Channel capacity in bits/s for a given bandwidth (Hz) and SNR (dB)."""
    snr_linear = 10**(snr_db / 10)  # convert dB to a linear power ratio
    return bandwidth * np.log2(1 + snr_linear)

# 1 MHz bandwidth, 20 dB SNR
C = channel_capacity(1e6, 20)
print(f"Capacity: {C/1e6:.2f} Mbps")
```
## Further Reading
### Related Snippets

- Data Compression: Lossy vs lossless compression, Huffman coding
- Entropy & Information Measures: Shannon entropy, cross-entropy, and KL divergence
- Information Theory Basics: Fundamental concepts of information theory
- Mutual Information: Measuring dependence between variables