Statistical Moments
Visual guide to the four statistical moments that describe a distribution's shape.
The Four Moments
Statistical moments describe different aspects of a probability distribution:
- First Moment: Mean (location/center)
- Second Moment: Variance (spread/dispersion)
- Third Moment: Skewness (asymmetry)
- Fourth Moment: Kurtosis (tail heaviness)
First Moment: Mean (μ)
Mean: The average value, center of mass of the distribution.
$$ \mu = E[X] = \frac{1}{n}\sum_{i=1}^n x_i $$
Interpretation: The mean is the "balance point" of the distribution - where it would balance if placed on a fulcrum.
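A minimal sketch of the first moment, assuming NumPy is available; the data values below are made up purely for illustration.

```python
import numpy as np

# Hypothetical sample, chosen so the arithmetic is easy to follow
x = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])

# First moment: sum of the values divided by their count
mean_manual = x.sum() / len(x)

# Same quantity via NumPy
mean_np = np.mean(x)

print(mean_manual, mean_np)  # 5.0 5.0
```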
Second Moment: Variance (σ²)
Variance: Average squared deviation from the mean.
$$ \sigma^2 = E[(X - \mu)^2] = \frac{1}{n}\sum_{i=1}^n (x_i - \mu)^2 $$
Standard Deviation: $\sigma = \sqrt{\sigma^2}$
Interpretation: Variance measures the spread or dispersion of data around the mean. Higher variance means data is more spread out.
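A short sketch of the second moment on the same kind of made-up sample. Note that `np.var` defaults to the 1/n divisor used in the formula above; passing `ddof=1` gives the unbiased sample estimate instead.

```python
import numpy as np

x = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
mu = x.mean()

# Second central moment: average squared deviation from the mean
var_manual = np.mean((x - mu) ** 2)

# np.var uses the same 1/n divisor by default; np.var(x, ddof=1) would
# divide by n - 1 for the unbiased sample variance instead
var_np = np.var(x)
std = np.sqrt(var_np)

print(var_manual, var_np, std)  # 4.0 4.0 2.0
```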
Third Moment: Skewness (γ₁)
Skewness: Measure of asymmetry of the distribution.
$$ \gamma_1 = E\left[\left(\frac{X - \mu}{\sigma}\right)^3\right] = \frac{E[(X - \mu)^3]}{\sigma^3} $$
- γ₁ = 0: Symmetric (e.g., the normal distribution)
- γ₁ > 0: Right-skewed (tail on right, mean > median)
- γ₁ < 0: Left-skewed (tail on left, mean < median)
Interpretation: Skewness measures asymmetry. Positive skew means the tail extends to the right; negative skew means the tail extends to the left.
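A hedged sketch comparing the definition above with `scipy.stats.skew`. The exponential sample is an illustrative choice (its theoretical skewness is 2), and scipy's default (`bias=True`) matches the plug-in formula computed by hand here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Right-skewed data: the exponential distribution has theoretical skewness 2
x = rng.exponential(scale=1.0, size=10_000)

mu, sigma = x.mean(), x.std()  # std with the 1/n divisor (ddof=0)

# Third standardized moment, straight from the definition
skew_manual = np.mean(((x - mu) / sigma) ** 3)

# scipy's default (bias=True) computes the same plug-in estimate
skew_scipy = stats.skew(x)

print(f"{skew_manual:.3f} {skew_scipy:.3f}")  # both positive, close to 2
```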
Fourth Moment: Kurtosis (γ₂)
Kurtosis: Measure of "tailedness" - how heavy or light the tails are.
$$ \gamma_2 = E\left[\left(\frac{X - \mu}{\sigma}\right)^4\right] - 3 $$
- γ₂ = 0: Mesokurtic (normal distribution)
- γ₂ > 0: Leptokurtic (heavy tails, sharp peak, more outliers)
- γ₂ < 0: Platykurtic (light tails, flat peak, fewer outliers)
Interpretation: Kurtosis measures tail heaviness. High kurtosis means more extreme outliers; low kurtosis means fewer outliers.
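A sketch contrasting light and heavy tails, assuming NumPy and SciPy. The Laplace distribution is an illustrative heavy-tailed choice (theoretical excess kurtosis 3), and `scipy.stats.kurtosis` already subtracts 3 by default (`fisher=True`).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

normal_data = rng.normal(size=10_000)    # mesokurtic reference
laplace_data = rng.laplace(size=10_000)  # heavier tails; theoretical excess kurtosis 3

def excess_kurtosis(x):
    """Fourth standardized moment minus 3, straight from the definition."""
    z = (x - x.mean()) / x.std()
    return np.mean(z ** 4) - 3

# scipy.stats.kurtosis returns excess kurtosis by default (fisher=True)
print(excess_kurtosis(normal_data), stats.kurtosis(normal_data))    # both near 0
print(excess_kurtosis(laplace_data), stats.kurtosis(laplace_data))  # both near 3
```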
Summary Table
| Moment | Formula | Measures | Typical Values |
|---|---|---|---|
| 1st: Mean | $E[X]$ | Location/Center | Any real number |
| 2nd: Variance | $E[(X-\mu)^2]$ | Spread/Dispersion | ≥ 0 |
| 3rd: Skewness | $E[((X-\mu)/\sigma)^3]$ | Asymmetry | 0 = symmetric |
| 4th: Kurtosis | $E[((X-\mu)/\sigma)^4] - 3$ | Tail heaviness | 0 = normal |
Python Implementation
```python
import numpy as np
from scipy import stats

# Generate data
data = np.random.normal(100, 15, 1000)

# Calculate moments
mean = np.mean(data)
variance = np.var(data)
std = np.std(data)
skewness = stats.skew(data)
kurtosis = stats.kurtosis(data)

print(f"Mean: {mean:.2f}")
print(f"Variance: {variance:.2f}")
print(f"Std Dev: {std:.2f}")
print(f"Skewness: {skewness:.3f}")
print(f"Kurtosis: {kurtosis:.3f}")

# Interpretation
if abs(skewness) < 0.5:
    print("Distribution is approximately symmetric")
elif skewness > 0:
    print("Distribution is right-skewed")
else:
    print("Distribution is left-skewed")

if abs(kurtosis) < 0.5:
    print("Distribution has normal tail heaviness")
elif kurtosis > 0:
    print("Distribution has heavy tails (more outliers)")
else:
    print("Distribution has light tails (fewer outliers)")
```
Key Takeaways
Moments describe shape: Each moment captures a different aspect of the distribution
Order matters: Higher moments build on lower ones; skewness and kurtosis are computed from deviations that are first centered by the mean and scaled by the standard deviation
Standardization: Skewness and kurtosis use standardized values (z-scores)
Normal distribution: Mean and variance can take any values; skewness = 0 and excess kurtosis = 0
Robustness: Mean and variance are sensitive to outliers; the median and IQR are more robust (see the sketch below)
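A small sketch of the robustness point, with made-up numbers: a single extreme value shifts the mean and inflates the variance dramatically, while the median and IQR barely move.

```python
import numpy as np

clean = np.array([10.0, 11.0, 12.0, 13.0, 14.0])
with_outlier = np.append(clean, 1000.0)  # add one extreme value

for label, x in [("clean", clean), ("with outlier", with_outlier)]:
    q75, q25 = np.percentile(x, [75, 25])
    print(f"{label:>12}: mean={x.mean():7.1f}  var={x.var():9.1f}  "
          f"median={np.median(x):5.1f}  IQR={q75 - q25:4.1f}")
```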
Further Reading
Related Snippets
- Bayes' Theorem & Applications: Bayesian inference and practical applications
- Central Limit Theorem: Foundation of statistical inference
- Common Probability Distributions: Normal, Binomial, Poisson, Exponential, Gamma, Pareto distributions
- Monte Carlo Methods: Simulation and numerical integration
- Null Hypothesis Testing: Understanding null hypothesis and hypothesis testing
- P-Values Explained: Understanding p-values and statistical significance
- Percentiles and Quantiles: Understanding percentiles, quartiles, and quantiles
- Probability Basics: Fundamental probability concepts and rules
- Random Variables: Expected value, variance, and moments