TMVA_SOFIE_ONNX

This macro provides a simple example of parsing an ONNX file into an RModel object and then generating the .hxx header file for inference.

Author: Sanjiban Sengupta
This notebook tutorial was automatically generated with ROOTBOOK-izer from the macro found in the ROOT repository on Thursday, December 09, 2021 at 10:49 AM.

In [1]:
using namespace TMVA::Experimental;
In [2]:
//Creating parser object to parse ONNX files
 SOFIE::RModelParser_ONNX Parser;
 SOFIE::RModel model = Parser.Parse("../../tmva/sofie/test/input_models/Linear_16.onnx");

 //Generating inference code
 model.Generate();
 model.OutputGenerated("Linear_16.hxx");

 //Printing required input tensors
 model.PrintRequiredInputTensors();

 //Printing initialized tensors (weights)
 std::cout<<"\n\n";
 model.PrintInitializedTensors();

 //Printing intermediate tensors
 std::cout<<"\n\n";
 model.PrintIntermediateTensors();

 //Checking if a tensor already exists in the model
 std::cout<<"\n\nTensor \"16weight\" already exists: "<<std::boolalpha<<model.CheckIfTensorAlreadyExist("16weight")<<"\n\n";
 std::vector<size_t> tensorShape = model.GetTensorShape("16weight");
 std::cout<<"Shape of tensor \"16weight\": ";
 for(auto& it:tensorShape){
     std::cout<<it<<",";
 }
 std::cout<<"\n\nData type of tensor \"16weight\": ";
 SOFIE::ETensorType tensorType = model.GetTensorType("16weight");
 std::cout<<SOFIE::ConvertTypeToString(tensorType);

 //Printing generated inference code
 std::cout<<"\n\n";
 model.PrintGenerated();
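Once the header has been written out, it can be included in a standalone program to run inference with the model. The snippet below is a minimal sketch under stated assumptions: the namespace TMVA_SOFIE_Linear_16, the Session/infer interface, and the input size are placeholders based on typical SOFIE output, so check the generated Linear_16.hxx for the exact names and shapes in your ROOT version.

 // NOTE: the namespace TMVA_SOFIE_Linear_16 and the Session/infer names are
 // assumptions; inspect the generated Linear_16.hxx for the actual interface.
 #include "Linear_16.hxx"

 #include <iostream>
 #include <vector>

 int main() {
    // Dummy input tensor; its size must match the model's input shape,
    // which PrintRequiredInputTensors() reports above.
    std::vector<float> input(100, 1.f);

    // The generated Session object loads the stored weights and exposes
    // an infer() method that returns the output tensor as a std::vector.
    TMVA_SOFIE_Linear_16::Session session;
    std::vector<float> output = session.infer(input.data());

    std::cout << "Output size: " << output.size() << std::endl;
    return 0;
 }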