Class anira::OnnxRuntimeProcessor¶
-
class OnnxRuntimeProcessor : public anira::BackendBase¶
ONNX Runtime-based neural network inference processor.
The OnnxRuntimeProcessor class provides neural network inference capabilities using Microsoft’s ONNX Runtime. It supports loading ONNX models and performing real-time inference with optimized execution providers and parallel processing.
Warning
This class is only available when the library is compiled with USE_ONNXRUNTIME defined.
Public Functions
-
OnnxRuntimeProcessor(InferenceConfig &inference_config)¶
Constructs an ONNX Runtime processor with the given inference configuration.
Initializes the ONNX Runtime processor and creates the necessary number of parallel processing instances based on the configuration’s num_parallel_processors setting.
- Parameters:
inference_config – Reference to inference configuration containing model path, tensor shapes, and processing parameters
-
~OnnxRuntimeProcessor() override¶
Destructor that properly cleans up ONNX Runtime resources.
Ensures proper cleanup of all ONNX Runtime sessions, tensors, and allocated memory. All processing instances are safely destroyed with proper resource deallocation.
-
virtual void prepare() override¶
Prepares all ONNX Runtime instances for inference operations.
Loads the ONNX model into all parallel processing instances, allocates input/output tensors, and performs warm-up inferences if specified in the configuration.
-
virtual void process(...) override¶
Processes input buffers through the ONNX Runtime model.
Performs neural network inference using ONNX Runtime, converting audio buffers to ONNX tensors, executing the model, and converting results back to audio buffers.
- Parameters:
input – Vector of input buffers containing audio samples or parameter data
output – Vector of output buffers to receive processed results
session – Shared pointer to session element providing thread-safe instance access