# Minimization¶

Author: Omar Zapata
This notebook tutorial was automatically generated with ROOTBOOK-izer from the macro found in the ROOT repository on Monday, August 15, 2022 at 09:40 AM.

In [ ]:
%%cpp -d
#include <TRInterface.h>


In the next function, the usual `Double_t*` argument must be replaced by a TVectorD, because a raw pointer has no meaning in the R environment.

In [ ]:
%%cpp -d
Double_t RosenBrock(const TVectorD xx)
{
   const Double_t x = xx[0];
   const Double_t y = xx[1];
   const Double_t tmp1 = y - x * x;
   const Double_t tmp2 = 1 - x;
   return 100 * tmp1 * tmp1 + tmp2 * tmp2;
}


Definition of the gradient of the Rosenbrock function:

In [ ]:
%%cpp -d
TVectorD RosenBrockGrad(const TVectorD xx)
{
   const Double_t x = xx[0];
   const Double_t y = xx[1];
   TVectorD grad(2);
   grad[0] = -400 * x * (y - x * x) - 2 * (1 - x);
   grad[1] = 200 * (y - x * x);
   return grad;
}

In [ ]:
ROOT::R::TRInterface &r=ROOT::R::TRInterface::Instance();


Passing the RosenBrock function to R:

In [ ]:
r["RosenBrock"]=ROOT::R::TRFunctionExport(RosenBrock);


In [ ]:
r["RosenBrockGrad"]=ROOT::R::TRFunctionExport(RosenBrockGrad);


the option "method" could be "Nelder-Mead", "BFGS", "CG", "L-BFGS-B", "SANN","Brent"

the option "control" lets you put some constraints like "maxit" The maximum number of iterations. "abstol" The absolute convergence tolerance.

In [ ]:
r.Execute("result <- optim( c(0.01,0.01), RosenBrock,method='BFGS',control = list(maxit = 1000000) )");


"reltol" Relative convergence tolerance.

Getting results from R

In [ ]:
TVectorD min = r.Eval("result$par");
std::cout.precision(8);


printing results

In [ ]:
std::cout<<"-----------------------------------------"<<std::endl;
std::cout<<"Minimum x="<<min[0]<<" y="<<min[1]<<std::endl;
std::cout<<"Value at minimum ="<<RosenBrock(min)<<std::endl;


using the gradient

In [ ]:
r.Execute("optimHess(result$par, RosenBrock, RosenBrockGrad)");
r.Execute("hresult <- optim(c(-1.2,1), RosenBrock, NULL, method = 'BFGS', hessian = TRUE)");


Getting the minimum calculated with the gradient:

In [ ]:
TVectorD hmin = r.Eval("hresult$par");


printing results

In [ ]:
std::cout<<"-----------------------------------------"<<std::endl;
std::cout<<"Minimum x="<<hmin[0]<<" y="<<hmin[1]<<std::endl;
std::cout<<"Value at minimum ="<<RosenBrock(hmin)<<std::endl;