Struct anira::ModelData
-
struct ModelData
Container for neural network model data and metadata.
The ModelData struct encapsulates all information necessary to load and identify a neural network model for inference. It supports both binary model data (loaded from files) and string-based model paths, along with backend-specific metadata.
Public Functions
-
inline ModelData(void *data, size_t size, InferenceBackend backend, const std::string &model_function = "", bool is_binary = true)
Constructs ModelData with binary data or model path.
- Parameters:
data – Pointer to model data (binary data) or model path string
size – Size of the data in bytes (for binary) or string length (for paths)
backend – The inference backend that will use this model
model_function – Optional function name within the model (LibTorch only)
is_binary – Whether the data represents binary model data (true) or a file path (false)
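For illustration, a minimal sketch of wrapping an in-memory model blob with this constructor. The header path, the LIBTORCH enumerator, the "forward" function name, and the file name are assumptions, not taken from this page:

    #include <anira/anira.h>

    #include <fstream>
    #include <iterator>
    #include <vector>

    int main() {
        // Hypothetical model file read into memory by the caller.
        std::ifstream in("model.pt", std::ios::binary);
        std::vector<char> bytes{std::istreambuf_iterator<char>(in),
                                std::istreambuf_iterator<char>()};

        // is_binary = true: ModelData keeps the pointer but does not take
        // ownership, so `bytes` must stay alive while the model is in use.
        anira::ModelData model(bytes.data(), bytes.size(),
                               anira::InferenceBackend::LIBTORCH, // assumed enumerator
                               "forward",                         // LibTorch-only, assumed name
                               true);
        return 0;
    }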
-
inline ModelData(const std::string &model_path, InferenceBackend backend, const std::string &model_function = "", bool is_binary = false)
Constructs ModelData from a file path string.
Convenience constructor for creating ModelData from a model file path. The path string is copied internally and managed by the ModelData instance.
- Parameters:
model_path – Path to the model file on disk
backend – The inference backend that will use this model
model_function – Optional function name within the model (LibTorch only)
is_binary – Whether to treat the path as binary data (typically false for file paths)
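As a sketch of the path-based constructor (the ONNX enumerator and the file name are assumptions):

    #include <anira/anira.h>

    int main() {
        // The path string is copied and owned by the ModelData instance, so a
        // temporary argument is fine. model_function is left at its default
        // because it only applies to LibTorch models.
        anira::ModelData model("models/denoiser.onnx",          // hypothetical path
                               anira::InferenceBackend::ONNX);  // assumed enumerator
        return 0;
    }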
-
inline ModelData(const ModelData &other)
Copy constructor with proper memory management.
Creates a deep copy of the ModelData, ensuring independent memory management for non-binary data while sharing binary data references safely.
- Parameters:
other – The ModelData instance to copy from
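A small sketch of the copy behaviour described above (the enumerator and path are assumptions):

    #include <anira/anira.h>

    int main() {
        anira::ModelData original("models/denoiser.onnx",          // hypothetical path
                                  anira::InferenceBackend::ONNX);  // assumed enumerator

        // For non-binary (path) data the copy owns its own string buffer, so it
        // stays valid even if `original` is destroyed first.
        anira::ModelData copy(original);
        return 0;
    }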
-
inline ModelData &operator=(const ModelData &other)
Assignment operator with proper memory management.
Safely assigns one ModelData to another, handling memory deallocation and reallocation as needed for non-binary data.
- Parameters:
other – The ModelData instance to assign from
- Returns:
Reference to this ModelData instance
-
inline ~ModelData()
Destructor with automatic memory cleanup.
Automatically frees allocated memory for non-binary data to prevent memory leaks. Binary data is assumed to be managed externally and is not freed.
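To make the ownership rule concrete, a sketch (buffer contents and the enumerator are placeholders):

    #include <anira/anira.h>

    #include <vector>

    int main() {
        std::vector<char> bytes(1024);  // stands in for real model bytes
        {
            anira::ModelData model(bytes.data(), bytes.size(),
                                   anira::InferenceBackend::LIBTORCH,  // assumed enumerator
                                   "",    // no model function
                                   true); // binary data: not freed by ~ModelData()
        }   // destructor runs here; `bytes` is untouched and still owned by the caller
        return 0;
    }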
Public Members
-
void *m_data
Pointer to model data (binary data or string data).
-
size_t m_size
Size of the model data in bytes (for binary data) or the length of the path string.
-
InferenceBackend m_backend
Target inference backend for this model.
-
std::string m_model_function
Function name within the model (LibTorch specific).
-
bool m_is_binary
Whether the data represents binary model data (true) or a file path (false).