ImageDev

ReadOnnxModel

Loads an ONNX model into a persistent object in memory.

Function Syntax

This function returns outputOnnxModel.
// Function prototype
OnnxModel::Ptr readOnnxModel( const std::string& filePath, bool forceCpu );

// Deprecated prototype
// Please check the release notes to see when it will be permanently removed.
OnnxModel::Ptr readOnnxModel( const std::string& filePath );
This function returns output_onnx_model.
# Function prototype.
read_onnx_model(file_path: str = "",
                force_cpu: bool = False) -> OnnxModel
This function returns outputOnnxModel.
// Function prototype.
public static OnnxModel
ReadOnnxModel( String filePath = "",
               bool forceCpu = false );

Class Syntax

Parameters

C++

filePath (input)
    The path of the ONNX model file.
    Type: String. Default value: "".

forceCpu (input)
    If true, forces the model to load into CPU memory even if a compatible GPU
    is available. This can be useful for debugging, running in lightweight
    environments, or avoiding GPU memory allocation.
    This parameter was added in 2025.5; check the release notes to see when it
    will become mandatory in the function prototype.
    Type: Bool. Default value: false.

outputOnnxModel (output)
    The object containing the ONNX model.
    Type: OnnxModel. Default value: nullptr.
Python

file_path (input)
    The path of the ONNX model file.
    Type: string. Default value: "".

force_cpu (input)
    If True, forces the model to load into CPU memory even if a compatible GPU
    is available. This can be useful for debugging, running in lightweight
    environments, or avoiding GPU memory allocation.
    This parameter was added in 2025.5; check the release notes to see when it
    will become mandatory in the function prototype.
    Type: bool. Default value: False.

output_onnx_model (output)
    The object containing the ONNX model.
    Type: OnnxModel. Default value: None.
C#

filePath (input)
    The path of the ONNX model file.
    Type: String. Default value: "".

forceCpu (input)
    If true, forces the model to load into CPU memory even if a compatible GPU
    is available. This can be useful for debugging, running in lightweight
    environments, or avoiding GPU memory allocation.
    This parameter was added in 2025.5; check the release notes to see when it
    will become mandatory in the function prototype.
    Type: Bool. Default value: false.

outputOnnxModel (output)
    The object containing the ONNX model.
    Type: OnnxModel. Default value: null.
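The forceCpu flag above lets the caller decide at runtime where the model should live. As one possible pattern, the decision can be driven by an environment variable so that deployments can opt into CPU-only loading without code changes. The sketch below is a generic Python helper, not part of the ImageDev API; the environment-variable name is an assumption made for this example.

```python
import os

def should_force_cpu(env_var: str = "IMAGEDEV_FORCE_CPU") -> bool:
    """Return True when the environment requests CPU-only model loading.

    Hypothetical helper: the environment-variable name is an assumption
    made for this sketch, not part of the ImageDev API.
    """
    return os.environ.get(env_var, "").strip().lower() in {"1", "true", "yes"}

# Usage sketch (requires ImageDev, so it is left commented out):
# import imagedev
# model = imagedev.read_onnx_model("noise2noise.onnx",
#                                  force_cpu=should_force_cpu())
```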

Object Examples


ReadOnnxModel readOnnxModelAlgo;
readOnnxModelAlgo.setFilePath( std::string( IMAGEDEVDATA_OBJECTS_FOLDER ) + "noise2noise.onnx" );
readOnnxModelAlgo.setForceCpu( false );
readOnnxModelAlgo.execute();

std::cout << "path: " << readOnnxModelAlgo.outputOnnxModel()->path() << std::endl;

read_onnx_model_algo = imagedev.ReadOnnxModel()
read_onnx_model_algo.file_path = imagedev_data.get_object_path("noise2noise.onnx")
read_onnx_model_algo.force_cpu = False
read_onnx_model_algo.execute()

print("path: ", str(read_onnx_model_algo.output_onnx_model.path()))

ReadOnnxModel readOnnxModelAlgo = new ReadOnnxModel
{
    filePath = @"Data/objects/noise2noise.onnx",
    forceCpu = false
};
readOnnxModelAlgo.Execute();

Console.WriteLine( "path: " + readOnnxModelAlgo.outputOnnxModel.path( ) );

Function Examples


auto result = readOnnxModel( std::string( IMAGEDEVDATA_OBJECTS_FOLDER ) + "noise2noise.onnx", false );

std::cout << "path: " << result->path() << std::endl;

result = imagedev.read_onnx_model(imagedev_data.get_object_path("noise2noise.onnx"), False)

print("path: ", str(result.path()))

OnnxModel result = Processing.ReadOnnxModel( @"Data/objects/noise2noise.onnx", false );

Console.WriteLine( "path: " + result.path( ) );
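The behavior of readOnnxModel on a missing or malformed path is not described above. A stdlib-only pre-check (generic Python, not part of the ImageDev API) can surface a clearer error message before the call:

```python
from pathlib import Path

def validate_onnx_path(file_path: str) -> Path:
    """Check that file_path names an existing .onnx file.

    Generic pre-check written for this sketch; it is not part of the
    ImageDev API.
    """
    path = Path(file_path)
    if path.suffix.lower() != ".onnx":
        raise ValueError(f"expected a .onnx file, got: {path.name}")
    if not path.is_file():
        raise FileNotFoundError(f"ONNX model not found: {path}")
    return path

# Usage sketch (requires ImageDev, so it is left commented out):
# import imagedev
# model = imagedev.read_onnx_model(str(validate_onnx_path(model_path)))
```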



© 2026 Thermo Fisher Scientific Inc. All rights reserved.