SciPy's optimize module provides functions for minimizing (or maximizing) objective functions, possibly subject to constraints. It includes solvers for nonlinear problems (with support for both local and global optimization algorithms), linear programming, constrained and nonlinear least-squares, root finding, and curve fitting.
'Nelder-Mead', 'Powell', 'CG', 'BFGS', 'Newton-CG',
'L-BFGS-B', 'TNC', 'COBYLA', 'SLSQP', 'trust-constr',
'dogleg', 'trust-ncg', 'trust-exact', 'trust-krylov'
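Any of these names can be passed as the `method` argument of `minimize`. A minimal sketch on a simple convex quadratic (the function and starting point here are illustrative, not from the original):

```python
import numpy as np
from scipy.optimize import minimize

# Simple convex quadratic with a unique minimum at x = (1, 2).
def f(x):
    return (x[0] - 1.0)**2 + (x[1] - 2.0)**2

x0 = np.zeros(2)

# Try a few of the derivative-free and gradient-based local methods
# from the list above; all should land near (1, 2).
results = {}
for method in ['Nelder-Mead', 'Powell', 'CG', 'BFGS']:
    results[method] = minimize(f, x0, method=method).x
    print(method, results[method])
```

On a well-conditioned problem like this, all four methods agree; the differences between them show up on harder, ill-conditioned objectives such as the Rosenbrock function below.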
Method CG
uses a nonlinear conjugate gradient algorithm by Polak and Ribière, a variant of the Fletcher-Reeves method described in [5], pp. 120-122. Only the first derivatives are used.
import numpy as np
import numpy.linalg as la
import scipy.optimize as sopt
from scipy.optimize import minimize
import matplotlib.pyplot as pt
from mpl_toolkits.mplot3d import axes3d
%matplotlib inline
import seaborn as sns
sns.set()
The minimum value of this function is 0, which is achieved when every component of x equals 1.
def rosen(x):
    """The Rosenbrock function"""
    return sum(100.0*(x[1:]-x[:-1]**2.0)**2.0 + (1-x[:-1])**2.0)
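As a quick sanity check of the stated minimum, the function evaluates to 0 at the all-ones vector:

```python
import numpy as np

def rosen(x):
    """The Rosenbrock function"""
    return sum(100.0*(x[1:]-x[:-1]**2.0)**2.0 + (1-x[:-1])**2.0)

# The global minimum is at x = (1, ..., 1), where the value is 0.
print(rosen(np.ones(5)))  # → 0.0
```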
x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
res = minimize(rosen, x0, method='CG', options={'disp': True})
Optimization terminated successfully.
         Current function value: 0.000000
         Iterations: 67
         Function evaluations: 973
         Gradient evaluations: 139
res.x
array([0.99999927, 0.99999853, 0.99999706, 0.99999411, 0.99998819])
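Since no `jac` argument was supplied, `minimize` approximated the gradient by finite differences, which is what drives the function-evaluation count so high (973 evaluations for 67 iterations). A sketch of supplying the analytic gradient instead, using the `rosen` and `rosen_der` helpers that SciPy ships in `scipy.optimize`:

```python
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])

# Passing the analytic gradient via `jac` avoids the finite-difference
# approximation, cutting the number of function evaluations sharply.
res = minimize(rosen, x0, method='CG', jac=rosen_der)
print(res.nfev, res.njev)  # far fewer function evaluations than before
print(res.x)               # still converges to (approximately) all ones
```

Because CG uses only first derivatives, an exact `jac` is the cheapest way to speed it up; the same keyword works for all the gradient-based methods listed above.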