Class anira::InferenceThread

class InferenceThread : public anira::HighPriorityThread

Inheritance diagram for anira::InferenceThread:

    anira::InferenceThread → anira::HighPriorityThread (public inheritance)

Collaboration diagram for anira::InferenceThread:

    anira::InferenceThread → anira::HighPriorityThread (public inheritance)
    anira::InferenceThread → anira::InferenceData (usage)

High-priority thread class for executing neural network inference operations.

The InferenceThread class extends HighPriorityThread to provide a dedicated thread for executing neural network inference in real-time audio processing contexts. It consumes inference requests from a shared concurrent queue supplied at construction and processes them with minimal latency while preserving thread safety and real-time performance guarantees.

Note

This class inherits from HighPriorityThread and automatically manages thread priority elevation for optimal real-time performance.

Public Functions

InferenceThread(moodycamel::ConcurrentQueue<InferenceData> &next_inference)

Constructor that initializes the inference thread with a task queue.

Creates an inference thread that will process inference requests from the provided concurrent queue. The thread is not started automatically and must be explicitly started using the start() method.

Parameters:
  • next_inference – Reference to a thread-safe concurrent queue containing inference data structures to process

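A minimal construction sketch, assuming an umbrella include path (the actual header layout may differ per install); start() is inherited from HighPriorityThread, as documented above:

    #include <anira/anira.h>  // assumed umbrella header; adjust to your install

    int main() {
        // Thread-safe queue that producers (e.g. the audio callback) fill
        // with inference requests.
        moodycamel::ConcurrentQueue<anira::InferenceData> next_inference;

        // The constructor only stores a reference to the queue; the thread
        // is not running yet.
        anira::InferenceThread inference_thread(next_inference);

        // start() is inherited from HighPriorityThread and must be called
        // explicitly before any requests are processed.
        inference_thread.start();

        // ... enqueue InferenceData from the real-time side ...

        return 0;  // ~InferenceThread() stops the thread and cleans up
    }
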
~InferenceThread() override

Destructor that ensures proper cleanup of thread resources.

Automatically stops the inference thread if it’s still running and performs cleanup of any remaining inference data or resources.

bool execute()

Executes a single iteration of inference processing.

Attempts to dequeue and process one inference request from the queue. This method is designed to be called repeatedly in a loop and provides efficient processing with automatic backoff when no work is available.

The method handles:

  • Dequeuing inference data from the concurrent queue

  • Processing the inference request through the appropriate session

  • Managing CPU usage through exponential backoff strategies

  • Thread-safe access to shared data structures

Note

This method is real-time safe and designed for repeated calls in a high-frequency processing loop.

Returns:

True if an inference operation was executed, false if no work was available
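
A sketch of that calling contract follows; the drive() wrapper and should_exit flag are hypothetical, since anira's actual processing loop lives inside HighPriorityThread and is not part of the public API:

    #include <anira/anira.h>  // assumed umbrella header; adjust to your install
    #include <atomic>
    #include <thread>

    // Hypothetical driver illustrating repeated calls to execute().
    void drive(anira::InferenceThread& inference_thread,
               std::atomic<bool>& should_exit) {
        while (!should_exit.load(std::memory_order_relaxed)) {
            // execute() returns true when a request was dequeued and run.
            if (!inference_thread.execute()) {
                // Queue was empty: execute() already applies exponential
                // backoff internally, so a light yield suffices here.
                std::this_thread::yield();
            }
        }
    }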