zea.models.lv_segmentation

nnU-Net segmentation model trained on the augmented CAMUS dataset.

To try this model, simply load one of the available presets:

>>> from zea.models.lv_segmentation import AugmentedCamusSeg

>>> model = AugmentedCamusSeg.from_preset("augmented_camus_seg")

The model segments both the left ventricle and myocardium.

At the time of writing (17 September 2025) and to the best of our knowledge, it is the state-of-the-art model for left ventricle segmentation on the CAMUS dataset.
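As a quick sketch of the expected input format: a single 256x256 grayscale frame is shaped into a [batch, 1, 256, 256] array before being passed to the model. The actual model call (commented out below) requires the preset weights and onnxruntime to be available; the input values can be in any numeric range, since the model normalizes internally.

```python
import numpy as np

# A single grayscale echo frame; any numeric range is fine,
# since the model normalizes its inputs internally.
frame = np.random.rand(256, 256).astype(np.float32)

# Add batch and channel dimensions: [batch, 1, 256, 256].
inputs = frame[np.newaxis, np.newaxis, :, :]
print(inputs.shape)  # (1, 1, 256, 256)

# Running inference requires the preset weights and onnxruntime:
# from zea.models.lv_segmentation import AugmentedCamusSeg
# model = AugmentedCamusSeg.from_preset("augmented_camus_seg")
# logits = model.call(inputs)  # shape [1, 3, 256, 256]
```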

Important

This is a zea implementation of the model. For the original paper and code, see the reference below.

Van De Vyver, Gilles, et al. “Generative augmentations for improved cardiac ultrasound segmentation using diffusion models.” https://arxiv.org/abs/2502.20100

See also

A tutorial notebook where this model is used: Left ventricle segmentation.

Note

The model was originally trained in PyTorch and converted to ONNX. To run inference, you must have onnxruntime installed.

You can install it using pip:

pip install onnxruntime

Classes

AugmentedCamusSeg(*args, **kwargs)

nnU-Net based left ventricle and myocardium segmentation model.

class zea.models.lv_segmentation.AugmentedCamusSeg(*args, **kwargs)[source]

Bases: BaseModel

nnU-Net based left ventricle and myocardium segmentation model.

  • Trained on the augmented CAMUS dataset.

  • This class loads an ONNX model and provides inference for cardiac ultrasound segmentation tasks.

call(inputs)[source]

Run inference on the input data using the loaded ONNX model.

Parameters:

inputs (np.ndarray) – Input image or batch of images for segmentation. Shape: [batch, 1, 256, 256]. Range: any numeric range; inputs are normalized internally.

Returns:

Segmentation mask(s) for left ventricle and myocardium.

Shape: [batch, 3, 256, 256] (logits for background, LV, myocardium)

Return type:

np.ndarray

Raises:

ValueError – If model weights are not loaded.
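The returned logits can be reduced to a per-pixel class mask with an argmax over the channel axis. A minimal numpy sketch, using dummy logits in the documented [batch, 3, 256, 256] shape with channels ordered (background, LV, myocardium) as stated above:

```python
import numpy as np

# Dummy logits standing in for the model output: [batch, 3, 256, 256],
# with channels ordered (background, LV, myocardium).
rng = np.random.default_rng(0)
logits = rng.normal(size=(1, 3, 256, 256)).astype(np.float32)

# Per-pixel class labels: 0 = background, 1 = LV, 2 = myocardium.
mask = logits.argmax(axis=1)
print(mask.shape)  # (1, 256, 256)
```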

custom_load_weights(preset, **kwargs)[source]

Load the ONNX weights for the segmentation model.