Estimate values between known data points.

### SciPy Interpolation

```python
from scipy import interpolate
import numpy as np

x = np.array([0, 1, 2, 3, 4])
y = np.array([0, 1, 4, 9, 16])

# Linear
f_linear = interpolate.interp1d(x, y, kind='linear')

# Cubic spline
f_cubic = interpolate.interp1d(x, y, kind='cubic')
```
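A quick usage sketch for the interpolants above (the query points are arbitrary choices; `interp1d` raises for queries outside `[0, 4]` by default):

```python
# Query between the knots
print(f_linear(2.5))  # 6.5  -- linear between (2, 4) and (3, 9)
print(f_cubic(2.5))   # 6.25 -- the spline recovers y = x**2 here
```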
Monte Carlo methods solve complex problems through random sampling.

### Principle

Use random sampling to solve deterministic or stochastic problems.

### Estimating π

*Interactive animation in the original article.*

### Integration Concept

Monte Carlo integration works by randomly sampling points and taking the ratio of samples that satisfy a condition; for points drawn uniformly in the unit square, the fraction landing inside the quarter circle approaches π/4.
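A minimal sketch of that π estimate (the sample count and seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Uniform points in the unit square
pts = rng.random((n, 2))

# Fraction inside the quarter circle x^2 + y^2 <= 1
inside = (pts ** 2).sum(axis=1) <= 1.0
pi_estimate = 4.0 * inside.mean()

print(pi_estimate)  # ~3.14, with O(1/sqrt(n)) statistical error
```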
### Finite Differences

```python
import numpy as np

# Forward difference: O(h) error
df_dx = (f(x + h) - f(x)) / h

# Central difference: O(h^2) error (more accurate)
df_dx = (f(x + h) - f(x - h)) / (2 * h)

# NumPy gradient of sampled values y over grid x
df_dx = np.gradient(y, x)
```

Further Reading: Numerical Differentiation - Wikipedia
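To make the accuracy claim concrete, a small check (the test function and step sizes are assumptions): the forward-difference error shrinks roughly 10x per row, the central-difference error roughly 100x.

```python
import numpy as np

f, df_true = np.sin, np.cos   # test function with a known derivative
x0 = 1.0

for h in (1e-2, 1e-3, 1e-4):
    fwd = (f(x0 + h) - f(x0)) / h
    ctr = (f(x0 + h) - f(x0 - h)) / (2 * h)
    print(f"h={h:g}  forward err={abs(fwd - df_true(x0)):.2e}  "
          f"central err={abs(ctr - df_true(x0)):.2e}")
```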
Approximate $\int_a^b f(x)\,dx$.

### Trapezoidal Rule

```python
from scipy.integrate import trapezoid, simpson, quad

# Trapezoidal rule (trapz/simps were removed in SciPy 1.14; use trapezoid/simpson)
result = trapezoid(y, x)

# Simpson's rule
result = simpson(y, x=x)

# Adaptive quadrature (most accurate; f must be a callable)
result, error = quad(f, a, b)
```

Further Reading: Numerical Integration - Wikipedia
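A usage sketch on a known integral, $\int_0^\pi \sin x \, dx = 2$ (the grid size is an arbitrary choice):

```python
import numpy as np
from scipy.integrate import trapezoid, simpson, quad

x = np.linspace(0, np.pi, 101)
y = np.sin(x)

print(trapezoid(y, x))   # ~1.9998 (exact: 2)
print(simpson(y, x=x))   # ~2.0000000
result, error = quad(np.sin, 0, np.pi)
print(result, error)     # 2.0 plus an error estimate
```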
Minimize $f(x)$.

### Gradient Descent

$$ x_{k+1} = x_k - \alpha \nabla f(x_k) $$

```python
import numpy as np

def gradient_descent(f, grad_f, x0, alpha=0.01, tol=1e-6, max_iter=1000):
    """Step against the gradient until it is smaller than tol."""
    x = x0
    for i in range(max_iter):
        grad = grad_f(x)
        if np.linalg.norm(grad) < tol:
            break
        x = x - alpha * grad
    return x
```

### SciPy

```python
from scipy.optimize import …
```
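A usage sketch for the `gradient_descent` routine defined above (the quadratic objective, step size, and starting point are arbitrary choices):

```python
import numpy as np

# Minimize f(x) = ||x - c||^2, whose gradient is 2 * (x - c)
c = np.array([3.0, -1.0])
f = lambda x: np.sum((x - c) ** 2)
grad_f = lambda x: 2 * (x - c)

x_min = gradient_descent(f, grad_f, x0=np.zeros(2), alpha=0.1)
print(x_min)  # converges to [3, -1]
```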
## Regularization Techniques
Prevent overfitting by adding penalties to the objective function.

### L2 Regularization (Ridge)

$$ \min_w \|Xw - y\|^2 + \lambda \|w\|^2 $$

```python
from sklearn.linear_model import Ridge

model = Ridge(alpha=1.0)  # alpha plays the role of lambda
model.fit(X, y)
```

### L1 Regularization (Lasso)

$$ \min_w \|Xw - y\|^2 + \lambda \|w\|_1 $$

Promotes sparsity: the L1 penalty drives many coefficients exactly to zero, as the sketch below illustrates.
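A hedged sketch with synthetic data (feature counts and alpha values are arbitrary choices): with only a few truly informative features, Lasso zeroes out the rest, while Ridge merely shrinks them.

```python
import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))
w_true = np.zeros(10)
w_true[:3] = [2.0, -1.5, 1.0]            # only 3 informative features
y = X @ w_true + 0.1 * rng.standard_normal(100)

lasso = Lasso(alpha=0.1).fit(X, y)
ridge = Ridge(alpha=1.0).fit(X, y)

print(np.round(lasso.coef_, 2))  # trailing coefficients are exactly 0
print(np.round(ridge.coef_, 2))  # small but nonzero everywhere
```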
Find $x$ such that $f(x) = 0$.

### Newton's Method

$$ x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)} $$

```python
def newton(f, df, x0, tol=1e-6, max_iter=100):
    """Newton's method for root finding"""
    x = x0
    for i in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        x = x - fx / df(x)
    return x  # assumed completion: original truncates here; return the last iterate
```
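A usage sketch for the `newton` routine above (the target function is an arbitrary example): find $\sqrt{2}$ as the positive root of $f(x) = x^2 - 2$.

```python
f = lambda x: x**2 - 2
df = lambda x: 2 * x

root = newton(f, df, x0=1.0)
print(root)  # 1.4142135..., i.e. sqrt(2), reached in a handful of iterations
```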
Solve $Ax = b$.

### Direct Methods

```python
import numpy as np
from scipy.linalg import solve, lu

# Direct solve
x = np.linalg.solve(A, b)

# LU decomposition: A = P @ L @ U
P, L, U = lu(A)
```

### Iterative Methods

```python
from scipy.sparse.linalg import cg, gmres

# Conjugate gradient (for symmetric positive definite A)
x, info = cg(A, b)

# GMRES (for general nonsymmetric A; call assumed from the truncated original)
x, info = gmres(A, b)
```
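A self-contained usage sketch (the test system is an assumption): build a small symmetric positive definite matrix and compare the direct and iterative answers.

```python
import numpy as np
from scipy.sparse.linalg import cg

rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M @ M.T + 50 * np.eye(50)   # symmetric positive definite
b = rng.standard_normal(50)

x_direct = np.linalg.solve(A, b)
x_cg, info = cg(A, b)            # info == 0 means converged

print(info, np.allclose(x_direct, x_cg, atol=1e-4))
```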