green_ai_bench.hailo_inference
source module green_ai_bench.hailo_inference
Hailo-specific implementation of BaseInference for model inference on Hailo hardware.
Classes
- HailoInference — Hailo-specific implementation of BaseInference for model inference on Hailo hardware.
source class HailoInference(model_path, **kwargs)
Bases: BaseInference
Hailo-specific implementation of BaseInference for model inference on Hailo hardware.
This class handles the setup, data generation, and inference for models running on Hailo accelerator hardware. It supports tracking power consumption during inference.
Initialize the HailoInference class.
Parameters
- model_path : str — Path to the compiled Hailo model file (.hef)
- **kwargs — Additional keyword arguments passed to BaseInference
Methods
- setup_model — Initialize and configure the Hailo model for inference.
- generate_data — Generate input data for the model using parameters from the input stream.
- infer — Run inference on the Hailo device with emission tracking.
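The methods above imply a setup → data → infer call sequence. The sketch below illustrates that sequence with a minimal stand-in class so it runs without a Hailo device or the HailoRT SDK; the `DummyHailoInference` class and all of its return values are illustrative placeholders, not the real API.

```python
# Illustrative stand-in for HailoInference: mirrors the documented
# interface so the call sequence can be shown without Hailo hardware.
class DummyHailoInference:
    def __init__(self, model_path, **kwargs):
        # In the real class, model_path points at a compiled .hef file.
        self.model_path = model_path

    def setup_model(self):
        # The real method returns (NetworkGroup, network-group params,
        # InputVStreamParams, OutputVStreamParams); strings/dicts stand
        # in for those objects here.
        return ("network_group", {}, "input_vstream_params", "output_vstream_params")

    def generate_data(self):
        # The real method shapes input from the input stream parameters.
        return [[0.0, 0.0, 0.0]]

    def infer(self, infer_pipeline):
        # The real method runs on-device inference with emission tracking.
        return {"output": [0.0]}


runner = DummyHailoInference("model.hef")
network_group, ng_params, in_params, out_params = runner.setup_model()
data = runner.generate_data()
results = runner.infer(infer_pipeline=None)
```

Note that `setup_model` returns a 4-tuple, so callers typically unpack it in a single assignment, as shown.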
source method HailoInference.setup_model()
Initialize and configure the Hailo model for inference.
Returns
- tuple — Contains:
  - NetworkGroup: the configured network group
  - Any: network group parameters
  - InputVStreamParams: input stream parameters
  - OutputVStreamParams: output stream parameters
Raises
- RuntimeError — If model initialization fails
source method HailoInference.generate_data()
Generate input data for the model using parameters from the input stream.
source method HailoInference.infer(infer_pipeline)
Run inference on the Hailo device with emission tracking.
Returns
- Any — Model inference results
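The emission tracking mentioned in infer can be pictured as wrapping the inference call in a tracker context. The sketch below uses a hypothetical wall-clock tracker in place of the benchmark's actual power-measurement backend; `emissions_tracker` and `run_tracked_inference` are illustrative names, not part of the real module.

```python
import time
from contextlib import contextmanager


@contextmanager
def emissions_tracker(metrics):
    # Hypothetical tracker: records wall-clock duration of the tracked
    # block. The real benchmark would measure power consumption instead.
    start = time.perf_counter()
    try:
        yield
    finally:
        metrics["duration_s"] = time.perf_counter() - start


def run_tracked_inference(infer_fn, infer_pipeline):
    # Mirrors the documented infer() contract: run inference on the
    # given pipeline and return the model results, tracking the run.
    metrics = {}
    with emissions_tracker(metrics):
        outputs = infer_fn(infer_pipeline)
    return outputs, metrics


outputs, metrics = run_tracked_inference(lambda p: {"logits": [0.1, 0.9]}, None)
```

Because the tracker is a context manager, metrics are recorded even if the inference call raises, which matters when benchmarking flaky hardware runs.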