## Information Content

$$ I(x) = -\log_2 p(x) $$

Rare events carry more information.

### Key Concepts

- **Entropy:** Average information content
- **Mutual Information:** Shared information between variables
- **Channel Capacity:** Maximum reliable transmission rate

### Further Reading

- [Information Theory - Wikipedia](https://en.wikipedia.org/wiki/Information_theory)
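To make the formula concrete, here is a minimal Python sketch; the helper names `self_information` and `entropy` are ours, not from the guide. It computes $I(x)$ in bits and treats entropy as the average self-information of a discrete distribution.

```python
import math

def self_information(p: float) -> float:
    """Information content I(x) = -log2 p(x), in bits."""
    return -math.log2(p)

def entropy(probs: list[float]) -> float:
    """Shannon entropy: the probability-weighted average of self-information."""
    return sum(p * self_information(p) for p in probs if p > 0)

# A fair coin flip carries exactly 1 bit; a 1-in-1000 event carries ~9.97 bits.
print(self_information(0.5))    # 1.0
print(self_information(0.001))  # ~9.966

# Entropy of a fair coin vs. a heavily biased one.
print(entropy([0.5, 0.5]))      # 1.0 bit
print(entropy([0.9, 0.1]))      # ~0.469 bits
```

The biased coin's lower entropy reflects the same idea as the formula: when one outcome dominates, the average surprise per observation shrinks.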
A visual guide to probability fundamentals and axioms.

## Axioms of Probability

### Axiom 1: Non-Negativity

$$ 0 \leq P(A) \leq 1 \text{ for any event } A $$

Probabilities are always between 0 (impossible) and 1 (certain). Strictly speaking, the non-negativity axiom states only that $P(A) \geq 0$; the upper bound of 1 follows from the normalization axiom $P(\Omega) = 1$.

*Figure: "Axiom 1: Probability Range [0, 1]", a number line from 0.0 (impossible) through 0.5 (equally likely) to 1.0 (certain), with example points such as Unlikely (0.25).*
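As a small illustration of how the axioms constrain a distribution, here is a minimal Python sketch, assuming a finite sample space; the helper `is_valid_distribution` is our own, not from the guide. It checks non-negativity and normalization, which together force every probability into $[0, 1]$.

```python
def is_valid_distribution(probs: dict[str, float], tol: float = 1e-9) -> bool:
    """Check a finite distribution against the probability axioms:
    non-negativity (each P >= 0) and normalization (probabilities sum to 1).
    Together these bound every individual probability in [0, 1]."""
    nonnegative = all(p >= 0 for p in probs.values())
    normalized = abs(sum(probs.values()) - 1.0) <= tol
    return nonnegative and normalized

# A fair six-sided die satisfies both axioms.
die = {str(face): 1 / 6 for face in range(1, 7)}
print(is_valid_distribution(die))                    # True

# Negative mass, or mass exceeding 1, violates the axioms.
print(is_valid_distribution({"a": -0.2, "b": 1.2}))  # False
```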