zea.models.regional_quality¶
MobileNetV2-based image quality model for myocardial regions in apical views.
To try this model, simply load one of the available presets:
>>> from zea.models.regional_quality import MobileNetv2RegionalQuality
>>> model = MobileNetv2RegionalQuality.from_preset("mobilenetv2_regional_quality")
The model predicts the image quality of myocardial regions in apical views. It can also be used to estimate overall image quality by averaging the regional scores.
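The averaging step can be sketched as a simple reduction over the region axis. This is a minimal illustration using a dummy NumPy score array in place of actual model output (the values are illustrative, not real predictions):

```python
import numpy as np

# Dummy regional scores for a batch of 2 images, 8 myocardial regions each.
# In practice these would come from calling the model; values are illustrative.
regional_scores = np.array([
    [0.9, 0.8, 0.7, 0.6, 0.8, 0.9, 0.5, 0.4],
    [0.3, 0.4, 0.5, 0.6, 0.4, 0.3, 0.2, 0.1],
])

# Overall image quality per image: mean over the region axis -> shape [batch]
overall_quality = regional_scores.mean(axis=1)
print(overall_quality)
```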
At the time of writing (17 September 2025) and to the best of our knowledge, it is the state-of-the-art model for regional image quality scoring in 2-D echocardiography.
Important
This is a zea implementation of the model.
For the original paper and code, see:
Van De Vyver, et al. “Regional Image Quality Scoring for 2-D Echocardiography Using Deep Learning.” Ultrasound in Medicine & Biology 51.4 (2025): 638-649.
See also
A tutorial notebook where this model is used: Myocardial image quality estimation.
Note
The model was originally a PyTorch model and has been converted to ONNX. To run inference with it, you must have onnxruntime installed.
You can install it using pip:
pip install onnxruntime
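To fail fast with a clear message when the optional dependency is missing, you can check for onnxruntime before loading the model. A small sketch (the message text is only a suggestion):

```python
from importlib.util import find_spec

# True if onnxruntime is importable in the current environment.
has_onnxruntime = find_spec("onnxruntime") is not None
if not has_onnxruntime:
    print("onnxruntime is missing; install it with `pip install onnxruntime`.")
```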
Classes
MobileNetv2RegionalQuality
MobileNetV2-based regional image quality scoring model for myocardial regions in apical views.
- class zea.models.regional_quality.MobileNetv2RegionalQuality(*args, **kwargs)[source]¶
Bases: BaseModel
MobileNetV2-based regional image quality scoring model for myocardial regions in apical views.
This class loads an ONNX model and provides inference for regional image quality scoring tasks.
- call(inputs)[source]¶
Predict regional image quality scores for input image(s).
- Parameters:
inputs (np.ndarray) – Input image or batch of images.
Shape – [batch, 1, 256, 256]
- Returns:
- Regional quality scores.
Shape is [batch, 8], with regions in order: basal_left, mid_left, apical_left, apical_right, mid_right, basal_right, annulus_left, annulus_right.
- Return type:
np.ndarray
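The documented region ordering can be paired with the output array to get a name-to-score mapping. A sketch using a dummy score vector in place of a real model prediction:

```python
import numpy as np

# Region order documented for the model's [batch, 8] output.
REGIONS = [
    "basal_left", "mid_left", "apical_left", "apical_right",
    "mid_right", "basal_right", "annulus_left", "annulus_right",
]

# Dummy prediction for a batch of one image; in real code this would be
# the model output for inputs of shape [batch, 1, 256, 256].
scores = np.array([[0.9, 0.8, 0.7, 0.6, 0.8, 0.9, 0.5, 0.4]])

# Map each region name to its score for the first image in the batch.
per_region = dict(zip(REGIONS, scores[0]))
print(per_region["apical_left"])
```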