TMVA Regression

This macro provides examples for the training and testing of the TMVA methods for regression.

The input data is a toy-MC sample consisting of two Gaussian-distributed input variables.

The methods to be used can be switched on and off by means of booleans, or via the prompt command, for example:

root -l TMVARegression.C\(\"LD,MLP\"\)

(note that the backslashes are mandatory). If no method is given, a default set is used.

The output file "TMVAReg.root" can be analysed with the use of dedicated macros (simply say: root -l <macro.C>), which can be conveniently invoked through a GUI that will appear at the end of the run of this macro.

  • Project : TMVA - a ROOT-integrated toolkit for multivariate data analysis
  • Package : TMVA
  • ROOT Macro: TMVARegression

Author: Andreas Hoecker
This notebook tutorial was automatically generated with ROOTBOOK-izer from the macro found in the ROOT repository on Monday, December 06, 2021 at 10:54 AM.

In [1]:
%%cpp -d
#include <cstdlib>
#include <iostream>
#include <map>
#include <string>

#include "TChain.h"
#include "TFile.h"
#include "TTree.h"
#include "TString.h"
#include "TObjString.h"
#include "TSystem.h"
#include "TROOT.h"

#include "TMVA/Tools.h"
#include "TMVA/Factory.h"
#include "TMVA/DataLoader.h"
#include "TMVA/TMVARegGui.h"


using namespace TMVA;

Arguments are defined.

In [2]:
TString myMethodList = "";

The explicit loading of the shared libTMVA is done in TMVAlogon.C, defined in .rootrc. If you use your private .rootrc, or run from a different directory, please copy the corresponding lines from .rootrc.

Methods to be processed can be given as an argument; use format:

 mylinux~> root -l TMVARegression.C\(\"myMethod1,myMethod2,myMethod3\"\)

This loads the library

In [3]:
TMVA::Tools::Instance();

Default MVA methods to be trained and tested

In [4]:
std::map<std::string,int> Use;

Multidimensional likelihood and nearest-neighbour methods

In [5]:
Use["PDERS"]           = 0;
Use["PDEFoam"]         = 1;
Use["KNN"]             = 1;

Linear Discriminant Analysis

In [6]:
Use["LD"]              = 1;

Function Discriminant analysis

In [7]:
Use["FDA_GA"]          = 0;
Use["FDA_MC"]          = 0;
Use["FDA_MT"]          = 0;
Use["FDA_GAMT"]        = 0;

Neural Network

In [8]:
Use["MLP"]             = 0;

Deep neural network (with CPU or GPU)

In [9]:
#ifdef R__HAS_TMVAGPU
Use["DNN_GPU"] = 1;
Use["DNN_CPU"] = 0;
#else
Use["DNN_GPU"] = 0;
#ifdef R__HAS_TMVACPU
Use["DNN_CPU"] = 1;
#else
Use["DNN_CPU"] = 0;
#endif
#endif
(The notebook converter reported unbalanced braces and did not process this cell, so the DNN flags keep the map's default value of 0.)

Support Vector Machine

In [10]:
Use["SVM"]             = 0;

Boosted Decision Trees

In [11]:
Use["BDT"]             = 0;
Use["BDTG"]            = 1;

In [12]:
std::cout << std::endl;
std::cout << "==> Start TMVARegression" << std::endl;
==> Start TMVARegression

Select methods (don't look at this code - not of interest)

In [13]:
if (myMethodList != "") {
   for (std::map<std::string,int>::iterator it = Use.begin(); it != Use.end(); it++) it->second = 0;

   std::vector<TString> mlist = gTools().SplitString( myMethodList, ',' );
   for (UInt_t i=0; i<mlist.size(); i++) {
      std::string regMethod(mlist[i].Data());

      if (Use.find(regMethod) == Use.end()) {
         std::cout << "Method \"" << regMethod << "\" not known in TMVA under this name. Choose among the following:" << std::endl;
         for (std::map<std::string,int>::iterator it = Use.begin(); it != Use.end(); it++) std::cout << it->first << " ";
         std::cout << std::endl;
         return;
      }
      Use[regMethod] = 1;
   }
}

Here the preparation phase begins

Create a new root output file

In [14]:
TString outfileName( "TMVAReg.root" );
TFile* outputFile = TFile::Open( outfileName, "RECREATE" );

Create the factory object. Later you can choose the methods whose performance you'd like to investigate. The factory will then run the performance analysis for you.

The first argument is the base of the name of all the weight files in the directory weights/.

The second argument is the output file for the training results. All TMVA output can be suppressed by removing the "!" (not) in front of the "Silent" argument in the option string.

In [15]:
TMVA::Factory *factory = new TMVA::Factory( "TMVARegression", outputFile,
                                            "!V:!Silent:Color:!DrawProgressBar:AnalysisType=Regression" );


TMVA::DataLoader *dataloader=new TMVA::DataLoader("dataset");

If you wish to modify default settings, check "src/Config.h" to see all available global options. For example:

 (TMVA::gConfig().GetVariablePlotting()).fTimesRMS = 8.0;
 (TMVA::gConfig().GetIONames()).fWeightFileDir = "myWeightDirectory";

Define the input variables that shall be used for the MVA training. Note that you may also use variable expressions, such as "3*var1/var2*abs(var3)" [all types of expressions that can also be parsed by TTree::Draw( "expression" )].

In [16]:
dataloader->AddVariable( "var1", "Variable 1", "units", 'F' );
dataloader->AddVariable( "var2", "Variable 2", "units", 'F' );

You can add so-called "spectator variables", which are not used in the MVA training, but will appear in the final "TestTree" produced by TMVA. This TestTree will contain the input variables, the response values of all trained MVAs, and the spectator variables.

In [17]:
dataloader->AddSpectator( "spec1:=var1*2",  "Spectator 1", "units", 'F' );
dataloader->AddSpectator( "spec2:=var1*3",  "Spectator 2", "units", 'F' );

Add the variable carrying the regression target

In [18]:
dataloader->AddTarget( "fvalue" );

It is also possible to declare additional targets for multi-dimensional regression, e.g.: dataloader->AddTarget( "fvalue2" ); BUT: this is currently ONLY implemented for MLP

Read training and test data (see TMVAClassification for reading ASCII files) and load the event sample from a ROOT tree

In [19]:
TFile *input(0);
TString fname = "./tmva_reg_example.root";
if (!gSystem->AccessPathName( fname )) {
   input = TFile::Open( fname ); // check if file in local directory exists
}
else {
   TFile::SetCacheFileDir(".");
   input = TFile::Open("http://root.cern.ch/files/tmva_reg_example.root", "CACHEREAD"); // if not: download from ROOT server
}
if (!input) {
   std::cout << "ERROR: could not open data file" << std::endl;
   exit(1);
}
std::cout << "--- TMVARegression           : Using input file: " << input->GetName() << std::endl;
--- TMVARegression           : Using input file: ./files/tmva_reg_example.root
Info in <TFile::OpenFromCache>: using local cache copy of http://root.cern.ch/files/tmva_reg_example.root [./files/tmva_reg_example.root]

Register the regression tree

In [20]:
TTree *regTree = (TTree*)input->Get("TreeR");

Global event weights per tree (see below for setting event-wise weights)

In [21]:
Double_t regWeight  = 1.0;

You can add an arbitrary number of regression trees

In [22]:
dataloader->AddRegressionTree( regTree, regWeight );
DataSetInfo              : [dataset] : Added class "Regression"
                         : Add Tree TreeR of type Regression with 10000 events

This would set individual event weights (the variables defined in the expression need to exist in the original TTree)

In [23]:
dataloader->SetWeightExpression( "var1", "Regression" );

Apply additional cuts on the event sample

In [24]:
TCut mycut = ""; // for example: TCut mycut = "abs(var1)<0.5 && abs(var2-0.5)<1";

Tell the dataloader to use all remaining events in the trees after training for testing:

In [25]:
dataloader->PrepareTrainingAndTestTree( mycut,
                                      "nTrain_Regression=1000:nTest_Regression=0:SplitMode=Random:NormMode=NumEvents:!V" );
                         : Dataset[dataset] : Class index : 0  name : Regression
An alternative call, leaving the numbers of events unspecified:

 dataloader->PrepareTrainingAndTestTree( mycut,
        "nTrain_Regression=0:nTest_Regression=0:SplitMode=Random:NormMode=NumEvents:!V" );

If no numbers of events are given, half of the events in the tree are used for training, and the other half for testing:

 dataloader->PrepareTrainingAndTestTree( mycut, "SplitMode=Random:!V" );

Book mva methods

Please look up the various method configuration options in the corresponding cxx files, e.g. src/MethodCuts.cxx, or here: http://tmva.sourceforge.net/optionRef.html. It is possible to preset ranges in the option string in which the cut optimisation should be done: "...:CutRangeMin[2]=-1:CutRangeMax[2]=1:...", where [2] is the third input variable.

PDE-RS method

In [26]:
if (Use["PDERS"])
   factory->BookMethod( dataloader,  TMVA::Types::kPDERS, "PDERS",
                        "!H:!V:NormTree=T:VolumeRangeMode=Adaptive:KernelEstimator=Gauss:GaussSigma=0.3:NEventsMin=40:NEventsMax=60:VarTransform=None" );

The option strings for the MinMax and RMS volume range modes, respectively:

  "!H:!V:VolumeRangeMode=MinMax:DeltaFrac=0.2:KernelEstimator=Gauss:GaussSigma=0.3" );
  "!H:!V:VolumeRangeMode=RMS:DeltaFrac=3:KernelEstimator=Gauss:GaussSigma=0.3" );
In [27]:
if (Use["PDEFoam"])
    factory->BookMethod( dataloader,  TMVA::Types::kPDEFoam, "PDEFoam",
          "!H:!V:MultiTargetRegression=F:TargetSelection=Mpv:TailCut=0.001:VolFrac=0.0666:nActiveCells=500:nSampl=2000:nBin=5:Compress=T:Kernel=None:Nmin=10:VarTransform=None" );
Factory                  : Booking method: PDEFoam
                         : 
                         : Rebuilding Dataset dataset
                         : Building event vectors for type 2 Regression
                         : Dataset[dataset] :  create input formulas for tree TreeR
DataSetFactory           : [dataset] : Number of events in input trees
                         : 
                         : Number of training and testing events
                         : ---------------------------------------------------------------------------
                         : Regression -- training events            : 1000
                         : Regression -- testing events             : 9000
                         : Regression -- training and testing events: 10000
                         : 
DataSetInfo              : Correlation matrix (Regression):
                         : ------------------------
                         :             var1    var2
                         :    var1:  +1.000  +0.006
                         :    var2:  +0.006  +1.000
                         : ------------------------
DataSetFactory           : [dataset] :  
                         : 

K-nearest neighbour classifier (KNN)

In [28]:
if (Use["KNN"])
   factory->BookMethod( dataloader,  TMVA::Types::kKNN, "KNN",
                        "nkNN=20:ScaleFrac=0.8:SigmaFact=1.0:Kernel=Gaus:UseKernel=F:UseWeight=T:!Trim" );
Factory                  : Booking method: KNN
                         : 

Linear discriminant

In [29]:
if (Use["LD"])
   factory->BookMethod( dataloader,  TMVA::Types::kLD, "LD",
                        "!H:!V:VarTransform=None" );
Factory                  : Booking method: LD
                         : 

Function discriminant analysis (FDA) -- test of various fitters; the recommended one is MINUIT (or GA or SA)

In [30]:
if (Use["FDA_MC"])
   factory->BookMethod( dataloader,  TMVA::Types::kFDA, "FDA_MC",
                       "!H:!V:Formula=(0)+(1)*x0+(2)*x1:ParRanges=(-100,100);(-100,100);(-100,100):FitMethod=MC:SampleSize=100000:Sigma=0.1:VarTransform=D" );

if (Use["FDA_GA"]) // can also use Simulated Annealing (SA) algorithm (see Cuts_SA options) .. the formula of this example is good for parabolas
   factory->BookMethod( dataloader,  TMVA::Types::kFDA, "FDA_GA",
                        "!H:!V:Formula=(0)+(1)*x0+(2)*x1:ParRanges=(-100,100);(-100,100);(-100,100):FitMethod=GA:PopSize=100:Cycles=3:Steps=30:Trim=True:SaveBestGen=1:VarTransform=Norm" );

if (Use["FDA_MT"])
   factory->BookMethod( dataloader,  TMVA::Types::kFDA, "FDA_MT",
                        "!H:!V:Formula=(0)+(1)*x0+(2)*x1:ParRanges=(-100,100);(-100,100);(-100,100);(-10,10):FitMethod=MINUIT:ErrorLevel=1:PrintLevel=-1:FitStrategy=2:UseImprove:UseMinos:SetBatch" );

if (Use["FDA_GAMT"])
   factory->BookMethod( dataloader,  TMVA::Types::kFDA, "FDA_GAMT",
                        "!H:!V:Formula=(0)+(1)*x0+(2)*x1:ParRanges=(-100,100);(-100,100);(-100,100):FitMethod=GA:Converger=MINUIT:ErrorLevel=1:PrintLevel=-1:FitStrategy=0:!UseImprove:!UseMinos:SetBatch:Cycles=1:PopSize=5:Steps=5:Trim" );

Neural network (MLP)

In [31]:
if (Use["MLP"])
   factory->BookMethod( dataloader,  TMVA::Types::kMLP, "MLP", "!H:!V:VarTransform=Norm:NeuronType=tanh:NCycles=20000:HiddenLayers=N+20:TestRate=6:TrainingMethod=BFGS:Sampling=0.3:SamplingEpoch=0.8:ConvergenceImprove=1e-6:ConvergenceTests=15:!UseRegulator" );

if (Use["DNN_CPU"] || Use["DNN_GPU"]) {

   TString archOption =  Use["DNN_GPU"] ? "GPU" : "CPU";

   TString layoutString("Layout=TANH|50,TANH|50,TANH|50,LINEAR");


   TString trainingStrategyString("TrainingStrategy=");

   trainingStrategyString +="LearningRate=1e-3,Momentum=0.3,ConvergenceSteps=20,BatchSize=50,TestRepetitions=1,WeightDecay=0.0,Regularization=None,Optimizer=Adam";

   TString nnOptions("!H:V:ErrorStrategy=SUMOFSQUARES:VarTransform=G:WeightInitialization=XAVIERUNIFORM:Architecture=");
   nnOptions.Append(archOption);
   nnOptions.Append(":");
   nnOptions.Append(layoutString);
   nnOptions.Append(":");
   nnOptions.Append(trainingStrategyString);

   TString methodName = TString("DNN_") + archOption;

   factory->BookMethod(dataloader, TMVA::Types::kDL, methodName, nnOptions); // NN
}

Support vector machine

In [32]:
if (Use["SVM"])
   factory->BookMethod( dataloader,  TMVA::Types::kSVM, "SVM", "Gamma=0.25:Tol=0.001:VarTransform=Norm" );

Boosted decision trees

In [33]:
if (Use["BDT"])
  factory->BookMethod( dataloader,  TMVA::Types::kBDT, "BDT",
                        "!H:!V:NTrees=100:MinNodeSize=1.0%:BoostType=AdaBoostR2:SeparationType=RegressionVariance:nCuts=20:PruneMethod=CostComplexity:PruneStrength=30" );

if (Use["BDTG"])
  factory->BookMethod( dataloader,  TMVA::Types::kBDT, "BDTG",
                        "!H:!V:NTrees=2000::BoostType=Grad:Shrinkage=0.1:UseBaggedBoost:BaggedSampleFraction=0.5:nCuts=20:MaxDepth=3:MaxDepth=4" );
Factory                  : Booking method: BDTG
                         : 
<WARNING>                : Value for option maxdepth was previously set to 3
                         : the option NegWeightTreatment=InverseBoostNegWeights does not exist for BoostType=Grad
                         : --> change to new default NegWeightTreatment=Pray

Now you can tell the factory to train, test, and evaluate the MVAs.

Train MVAs using the set of training events

In [34]:
factory->TrainAllMethods();
Factory                  : Train all methods
Factory                  : [dataset] : Create Transformation "I" with events from all classes.
                         : 
                         : Transformation, Variable selection : 
                         : Input : variable 'var1' <---> Output : variable 'var1'
                         : Input : variable 'var2' <---> Output : variable 'var2'
TFHandler_Factory        : Variable        Mean        RMS   [        Min        Max ]
                         : -----------------------------------------------------------
                         :     var1:     3.3759     1.1674   [  0.0058046     4.9975 ]
                         :     var2:     2.4823     1.4587   [  0.0032142     4.9971 ]
                         :   fvalue:     165.93     84.643   [     2.0973     391.01 ]
                         : -----------------------------------------------------------
                         : Ranking input variables (method unspecific)...
IdTransformation         : Ranking result (top variable is best ranked)
                         : --------------------------------------------
                         : Rank : Variable  : |Correlation with target|
                         : --------------------------------------------
                         :    1 : var2      : 7.636e-01
                         :    2 : var1      : 5.936e-01
                         : --------------------------------------------
IdTransformation         : Ranking result (top variable is best ranked)
                         : -------------------------------------
                         : Rank : Variable  : Mutual information
                         : -------------------------------------
                         :    1 : var2      : 2.315e+00
                         :    2 : var1      : 1.882e+00
                         : -------------------------------------
IdTransformation         : Ranking result (top variable is best ranked)
                         : ------------------------------------
                         : Rank : Variable  : Correlation Ratio
                         : ------------------------------------
                         :    1 : var1      : 6.545e+00
                         :    2 : var2      : 2.414e+00
                         : ------------------------------------
IdTransformation         : Ranking result (top variable is best ranked)
                         : ----------------------------------------
                         : Rank : Variable  : Correlation Ratio (T)
                         : ----------------------------------------
                         :    1 : var2      : 8.189e-01
                         :    2 : var1      : 3.128e-01
                         : ----------------------------------------
Factory                  : Train method: PDEFoam for Regression
                         : 
                         : Build mono target regression foam
                         : Elapsed time: 0.61 sec                                 
                         : Elapsed time for training with 1000 events: 0.617 sec         
                         : Dataset[dataset] : Create results for training
                         : Dataset[dataset] : Evaluation of PDEFoam on training sample
                         : Dataset[dataset] : Elapsed time for evaluation of 1000 events: 0.00455 sec       
                         : Create variable histograms
                         : Create regression target histograms
                         : Create regression average deviation
                         : Results created
                         : Creating xml weight file: dataset/weights/TMVARegression_PDEFoam.weights.xml
                         : writing foam MonoTargetRegressionFoam to file
                         : Foams written to file: dataset/weights/TMVARegression_PDEFoam.weights_foams.root
Factory                  : Training finished
                         : 
Factory                  : Train method: KNN for Regression
                         : 
KNN                      : <Train> start...
                         : Reading 1000 events
                         : Number of signal events 1000
                         : Number of background events 0
                         : Creating kd-tree with 1000 events
                         : Computing scale factor for 1d distributions: (ifrac, bottom, top) = (80%, 10%, 90%)
ModulekNN                : Optimizing tree for 2 variables with 1000 values
                         : <Fill> Class 1 has     1000 events
                         : Elapsed time for training with 1000 events: 0.00154 sec         
                         : Dataset[dataset] : Create results for training
                         : Dataset[dataset] : Evaluation of KNN on training sample
                         : Dataset[dataset] : Elapsed time for evaluation of 1000 events: 0.00743 sec       
                         : Create variable histograms
                         : Create regression target histograms
                         : Create regression average deviation
                         : Results created
                         : Creating xml weight file: dataset/weights/TMVARegression_KNN.weights.xml
Factory                  : Training finished
                         : 
Factory                  : Train method: LD for Regression
                         : 
LD                       : Results for LD coefficients:
                         : -----------------------
                         : Variable:  Coefficient:
                         : -----------------------
                         :     var1:      +42.509
                         :     var2:      +44.738
                         : (offset):      -88.627
                         : -----------------------
                         : Elapsed time for training with 1000 events: 0.000431 sec         
                         : Dataset[dataset] : Create results for training
                         : Dataset[dataset] : Evaluation of LD on training sample
                         : Dataset[dataset] : Elapsed time for evaluation of 1000 events: 0.000346 sec       
                         : Create variable histograms
                         : Create regression target histograms
                         : Create regression average deviation
                         : Results created
                         : Creating xml weight file: dataset/weights/TMVARegression_LD.weights.xml
Factory                  : Training finished
                         : 
Factory                  : Train method: BDTG for Regression
                         : 
                         : Regression Loss Function: Huber
                         : Training 2000 Decision Trees ... patience please
                         : Elapsed time for training with 1000 events: 1.64 sec         
                         : Dataset[dataset] : Create results for training
                         : Dataset[dataset] : Evaluation of BDTG on training sample
                         : Dataset[dataset] : Elapsed time for evaluation of 1000 events: 0.347 sec       
                         : Create variable histograms
                         : Create regression target histograms
                         : Create regression average deviation
                         : Results created
                         : Creating xml weight file: dataset/weights/TMVARegression_BDTG.weights.xml
                         : TMVAReg.root:/dataset/Method_BDT/BDTG
Factory                  : Training finished
                         : 
Factory                  : === Destroy and recreate all methods via weight files for testing ===
                         : 
                         : Reading weight file: dataset/weights/TMVARegression_PDEFoam.weights.xml
                         : Read foams from file: dataset/weights/TMVARegression_PDEFoam.weights_foams.root
                         : Reading weight file: dataset/weights/TMVARegression_KNN.weights.xml
                         : Creating kd-tree with 1000 events
                         : Computing scale factor for 1d distributions: (ifrac, bottom, top) = (80%, 10%, 90%)
ModulekNN                : Optimizing tree for 2 variables with 1000 values
                         : <Fill> Class 1 has     1000 events
                         : Reading weight file: dataset/weights/TMVARegression_LD.weights.xml
                         : Reading weight file: dataset/weights/TMVARegression_BDTG.weights.xml

Evaluate all MVAs using the set of test events

In [35]:
factory->TestAllMethods();
Factory                  : Test all methods
Factory                  : Test method: PDEFoam for Regression performance
                         : 
                         : Dataset[dataset] : Create results for testing
                         : Dataset[dataset] : Evaluation of PDEFoam on testing sample
                         : Dataset[dataset] : Elapsed time for evaluation of 9000 events: 0.043 sec       
                         : Create variable histograms
                         : Create regression target histograms
                         : Create regression average deviation
                         : Results created
Factory                  : Test method: KNN for Regression performance
                         : 
                         : Dataset[dataset] : Create results for testing
                         : Dataset[dataset] : Evaluation of KNN on testing sample
                         : Dataset[dataset] : Elapsed time for evaluation of 9000 events: 0.092 sec       
                         : Create variable histograms
                         : Create regression target histograms
                         : Create regression average deviation
                         : Results created
Factory                  : Test method: LD for Regression performance
                         : 
                         : Dataset[dataset] : Create results for testing
                         : Dataset[dataset] : Evaluation of LD on testing sample
                         : Dataset[dataset] : Elapsed time for evaluation of 9000 events: 0.0032 sec       
                         : Create variable histograms
                         : Create regression target histograms
                         : Create regression average deviation
                         : Results created
Factory                  : Test method: BDTG for Regression performance
                         : 
                         : Dataset[dataset] : Create results for testing
                         : Dataset[dataset] : Evaluation of BDTG on testing sample
                         : Dataset[dataset] : Elapsed time for evaluation of 9000 events: 2.09 sec       
                         : Create variable histograms
                         : Create regression target histograms
                         : Create regression average deviation
                         : Results created

Evaluate and compare performance of all configured MVAs

In [36]:
factory->EvaluateAllMethods();
Factory                  : Evaluate all methods
                         : Evaluate regression method: PDEFoam
                         : TestRegression (testing)
                         : Calculate regression for all events
                         : Elapsed time for evaluation of 9000 events: 0.0431 sec       
                         : TestRegression (training)
                         : Calculate regression for all events
                         : Elapsed time for evaluation of 1000 events: 0.00482 sec       
TFHandler_PDEFoam        : Variable        Mean        RMS   [        Min        Max ]
                         : -----------------------------------------------------------
                         :     var1:     3.3352     1.1893   [ 0.00020069     5.0000 ]
                         :     var2:     2.4860     1.4342   [ 0.00071490     5.0000 ]
                         :   fvalue:     163.91     83.651   [     1.6186     394.84 ]
                         : -----------------------------------------------------------
                         : Evaluate regression method: KNN
                         : TestRegression (testing)
                         : Calculate regression for all events
                         : Elapsed time for evaluation of 9000 events: 0.0939 sec       
                         : TestRegression (training)
                         : Calculate regression for all events
                         : Elapsed time for evaluation of 1000 events: 0.0105 sec       
TFHandler_KNN            : Variable        Mean        RMS   [        Min        Max ]
                         : -----------------------------------------------------------
                         :     var1:     3.3352     1.1893   [ 0.00020069     5.0000 ]
                         :     var2:     2.4860     1.4342   [ 0.00071490     5.0000 ]
                         :   fvalue:     163.91     83.651   [     1.6186     394.84 ]
                         : -----------------------------------------------------------
                         : Evaluate regression method: LD
                         : TestRegression (testing)
                         : Calculate regression for all events
                         : Elapsed time for evaluation of 9000 events: 0.0044 sec       
                         : TestRegression (training)
                         : Calculate regression for all events
                         : Elapsed time for evaluation of 1000 events: 0.000519 sec       
TFHandler_LD             : Variable        Mean        RMS   [        Min        Max ]
                         : -----------------------------------------------------------
                         :     var1:     3.3352     1.1893   [ 0.00020069     5.0000 ]
                         :     var2:     2.4860     1.4342   [ 0.00071490     5.0000 ]
                         :   fvalue:     163.91     83.651   [     1.6186     394.84 ]
                         : -----------------------------------------------------------
                         : Evaluate regression method: BDTG
                         : TestRegression (testing)
                         : Calculate regression for all events
                         : Elapsed time for evaluation of 9000 events: 2.15 sec       
                         : TestRegression (training)
                         : Calculate regression for all events
                         : Elapsed time for evaluation of 1000 events: 0.239 sec       
TFHandler_BDTG           : Variable        Mean        RMS   [        Min        Max ]
                         : -----------------------------------------------------------
                         :     var1:     3.3352     1.1893   [ 0.00020069     5.0000 ]
                         :     var2:     2.4860     1.4342   [ 0.00071490     5.0000 ]
                         :   fvalue:     163.91     83.651   [     1.6186     394.84 ]
                         : -----------------------------------------------------------
                         : 
                         : Evaluation results ranked by smallest RMS on test sample:
                         : ("Bias" quotes the mean deviation of the regression from true target.
                         :  "MutInf" is the "Mutual Information" between regression and target.
                         :  Indicated by "_T" are the corresponding "truncated" quantities ob-
                         :  tained when removing events deviating more than 2sigma from average.)
                         : --------------------------------------------------------------------------------------------------
                         : DataSet Name:         MVA Method:        <Bias>   <Bias_T>    RMS    RMS_T  |  MutInf MutInf_T
                         : --------------------------------------------------------------------------------------------------
                         : dataset              BDTG           :   0.0707    0.102     2.45     1.95  |  3.100  3.175
                         : dataset              KNN            :   -0.237    0.578     5.17     3.44  |  2.898  2.939
                         : dataset              PDEFoam        :    0.106  -0.0677     9.22     7.74  |  2.283  2.375
                         : dataset              LD             :    0.461     2.22     19.6     17.6  |  1.985  1.979
                         : --------------------------------------------------------------------------------------------------
                         : 
                         : Evaluation results ranked by smallest RMS on training sample:
                         : (overtraining check)
                         : --------------------------------------------------------------------------------------------------
                         : DataSet Name:         MVA Method:        <Bias>   <Bias_T>    RMS    RMS_T  |  MutInf MutInf_T
                         : --------------------------------------------------------------------------------------------------
                         : dataset              BDTG           :   0.0597   0.0107    0.566    0.293  |  3.441  3.466
                         : dataset              KNN            :   -0.425    0.423     5.19     3.54  |  3.006  3.034
                         : dataset              PDEFoam        : 8.35e-07    0.106     8.04     6.57  |  2.488  2.579
                         : dataset              LD             :-1.03e-06     1.54     20.1     18.5  |  2.134  2.153
                         : --------------------------------------------------------------------------------------------------
                         : 
Dataset:dataset          : Created tree 'TestTree' with 9000 events
                         : 
Dataset:dataset          : Created tree 'TrainTree' with 1000 events
                         : 
Factory                  : Thank you for using TMVA!
                         : For citation information, please visit: http://tmva.sf.net/citeTMVA.html

Save the output

In [37]:
outputFile->Close();

std::cout << "==> Wrote root file: " << outputFile->GetName() << std::endl;
std::cout << "==> TMVARegression is done!" << std::endl;

delete factory;
delete dataloader;
==> Wrote root file: TMVAReg.root
==> TMVARegression is done!

Launch the GUI for the ROOT macros

In [38]:
if (!gROOT->IsBatch()) TMVA::TMVARegGui( outfileName );