Optimization Methods

Goal: minimize a differentiable objective $f(x)$ over $x \in \mathbb{R}^n$, using function values and gradients.
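
The snippets below assume that f, grad_f, and x0 are already defined for your problem. As a minimal placeholder (an assumption for illustration, not part of the original snippets), a simple quadratic test objective works:

import numpy as np

# Placeholder objective: f(x) = ||x - 1||^2, minimized at x = [1, ..., 1].
def f(x):
    return np.sum((x - 1.0) ** 2)

# Analytic gradient of the quadratic: 2 * (x - 1).
def grad_f(x):
    return 2.0 * (x - 1.0)

x0 = np.zeros(3)  # starting point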

Gradient Descent

$$ x_{k+1} = x_k - \alpha \nabla f(x_k) $$

where $\alpha > 0$ is the step size (learning rate).

import numpy as np

def gradient_descent(f, grad_f, x0, alpha=0.01, tol=1e-6, max_iter=1000):
    """Fixed-step gradient descent; stops when the gradient norm drops below tol."""
    x = x0
    for _ in range(max_iter):
        grad = grad_f(x)
        # Converged: the gradient is numerically zero.
        if np.linalg.norm(grad) < tol:
            break
        x = x - alpha * grad
    return x
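
A quick sanity check with the quadratic placeholder defined above (the larger step size is only to speed up convergence on that example):

x_min = gradient_descent(f, grad_f, x0, alpha=0.1)
print(x_min)  # approaches [1. 1. 1.] for the quadratic placeholder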

SciPy

from scipy.optimize import minimize

# BFGS is a quasi-Newton method; supplying the analytic gradient via jac
# avoids finite-difference gradient approximations.
result = minimize(f, x0, method='BFGS', jac=grad_f)
x_opt = result.x
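
minimize returns an OptimizeResult, so it is worth checking the convergence flag before using x_opt. A minimal sketch:

# success, message, fun, and nit are standard OptimizeResult fields.
if not result.success:
    raise RuntimeError(f"BFGS did not converge: {result.message}")
print(result.fun, result.nit)  # final objective value and iteration count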
