{ "cells": [ { "cell_type": "markdown", "id": "0a50eb7a", "metadata": {}, "source": [ "# Color Doppler ultrasound" ] }, { "cell_type": "markdown", "id": "8dc39619", "metadata": {}, "source": [ "In this notebook, we demonstrate how to process and visualize Color Doppler ultrasound data using the `zea` library. Doppler ultrasound is a non-invasive imaging technique that measures the frequency shift of ultrasound waves reflected from moving objects, such as blood flow in vessels." ] }, { "cell_type": "markdown", "id": "b5967e4d", "metadata": {}, "source": [ "[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/tue-bmd/zea/blob/main/docs/source/notebooks/pipeline/doppler_example.ipynb)\n", " \n", "[![View on GitHub](https://img.shields.io/badge/GitHub-View%20Source-blue?logo=github)](https://github.com/tue-bmd/zea/blob/main/docs/source/notebooks/pipeline/doppler_example.ipynb)\n", " \n", "[![Hugging Face dataset](https://img.shields.io/badge/Hugging%20Face-Dataset-yellow?logo=huggingface)](https://huggingface.co/datasets/zeahub/zea-rotating-disk)" ] }, { "cell_type": "markdown", "id": "10d6235b", "metadata": {}, "source": [ "‼️ **Important:** This notebook is optimized for **GPU/TPU**. Code execution on a **CPU** may be very slow.\n", "\n", "If you are running in Colab, please enable a hardware accelerator via:\n", "\n", "**Runtime → Change runtime type → Hardware accelerator → GPU/TPU** 🚀." 
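] }, { "cell_type": "markdown", "id": "doppler-eq-note", "metadata": {}, "source": [ "Before processing any data, it helps to recall the physics. For a transmit center frequency `f_0`, a scatterer moving with axial velocity `v` in a medium with sound speed `c` produces a Doppler shift `f_d ≈ 2 * v * f_0 / c`, so the velocity can be recovered as `v = c * f_d / (2 * f_0)`. The cell below is a minimal sketch of this relation with illustrative numbers (5 MHz center frequency, 1540 m/s sound speed); these are *not* the acquisition parameters of the dataset, which are loaded from file later in this notebook." ] }, { "cell_type": "code", "execution_count": null, "id": "doppler-eq-code", "metadata": {}, "outputs": [], "source": [ "sound_speed = 1540.0  # m/s, typical soft-tissue value (illustrative)\n", "center_frequency = 5e6  # Hz, illustrative transmit center frequency\n", "doppler_shift = 1e3  # Hz, example measured frequency shift\n", "\n", "# Axial velocity from the Doppler equation v = c * f_d / (2 * f_0)\n", "velocity = sound_speed * doppler_shift / (2 * center_frequency)\n", "print(f\"Estimated axial velocity: {velocity * 100:.1f} cm/s\")"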
] }, { "cell_type": "code", "execution_count": 1, "id": "f48dd0d1", "metadata": {}, "outputs": [], "source": [ "%%capture\n", "%pip install zea" ] }, { "cell_type": "code", "execution_count": 2, "id": "e27aafe7", "metadata": {}, "outputs": [], "source": [ "import os\n", "\n", "os.environ[\"KERAS_BACKEND\"] = \"tensorflow\"\n", "os.environ[\"ZEA_DISABLE_CACHE\"] = \"1\"\n", "os.environ[\"ZEA_LOG_LEVEL\"] = \"INFO\"" ] }, { "cell_type": "markdown", "id": "b53f4f5f", "metadata": {}, "source": [ "We import the necessary libraries and modules." ] }, { "cell_type": "code", "execution_count": 3, "id": "6613ddc1", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "\u001b[1m\u001b[38;5;36mzea\u001b[0m\u001b[0m: Using backend 'tensorflow'\n" ] } ], "source": [ "import matplotlib.pyplot as plt\n", "import numpy as np\n", "\n", "import zea\n", "from zea import init_device\n", "from zea.doppler import color_doppler\n", "from zea.internal.notebooks import animate_images\n", "from zea.visualize import set_mpl_style" ] }, { "cell_type": "markdown", "id": "de5d19da", "metadata": {}, "source": [ "We'll use the following parameters for this experiment." ] }, { "cell_type": "code", "execution_count": 4, "id": "4090728b", "metadata": { "tags": [ "parameters" ] }, "outputs": [], "source": [ "n_frames = 25\n", "n_transmits = 10" ] }, { "cell_type": "markdown", "id": "988cce07", "metadata": {}, "source": [ "We use `init_device` to pick the best available device (GPU if present). Optionally, we also set the matplotlib style for plotting."
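] }, { "cell_type": "markdown", "id": "nyquist-note", "metadata": {}, "source": [ "A brief aside that is worth keeping in mind for the Doppler processing further down: the slow-time sampling rate equals the pulse repetition frequency (PRF), so the largest unambiguous Doppler shift is `PRF / 2`, which corresponds to an axial Nyquist velocity of `v_nyq = c * PRF / (4 * f_0)`. Flow faster than this aliases and wraps around in the color map. The sketch below uses illustrative numbers (5 kHz PRF, 5 MHz center frequency), not the values stored in the dataset." ] }, { "cell_type": "code", "execution_count": null, "id": "nyquist-code", "metadata": {}, "outputs": [], "source": [ "sound_speed = 1540.0  # m/s, illustrative\n", "prf = 5e3  # Hz, illustrative pulse repetition frequency\n", "center_frequency = 5e6  # Hz, illustrative\n", "\n", "# Largest unambiguously measurable axial velocity (aliasing limit):\n", "# |f_d| <= PRF / 2 combined with v = c * f_d / (2 * f_0)\n", "nyquist_velocity = sound_speed * prf / (4 * center_frequency)\n", "print(f\"Nyquist velocity: {nyquist_velocity * 100:.1f} cm/s\")"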
] }, { "cell_type": "code", "execution_count": 5, "id": "33d3a34d", "metadata": {}, "outputs": [], "source": [ "init_device(verbose=False)\n", "set_mpl_style()" ] }, { "cell_type": "markdown", "id": "5ee73c08", "metadata": {}, "source": [ "## Loading data\n", "To start, we will load some data from the [zea rotating disk](https://huggingface.co/datasets/zeahub/zea-rotating-disk) dataset, which is stored for convenience on the [Hugging Face Hub](https://huggingface.co/zeahub). You can also load your own data in zea format by using a local path instead of the Hugging Face URL.\n", "\n", "For more information on loading data, see the [Data documentation](../../data-acquisition.rst) or the data loading example notebook [here](../data/zea_data_example.ipynb).\n", "\n", "Note that all acquisition parameters are also stored in the zea data format, so that when we load the data we can also construct `zea.Probe` and `zea.Scan` objects, which will be useful later in the pipeline." ] }, { "cell_type": "code", "execution_count": 6, "id": "4d935024", "metadata": {}, "outputs": [ { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "4b972618d53f4c799338d81cf1220440", "version_major": 2, "version_minor": 0 }, "text/plain": [ "L115V_1radsec.hdf5: 0%| | 0.00/2.34G [00:00" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "pulse_repetition_frequency = 1 / sum(scan.time_to_next_transmit[0])\n", "d = color_doppler(\n", "    data4doppler,\n", "    probe.center_frequency,\n", "    pulse_repetition_frequency,\n", "    scan.sound_speed,\n", "    hamming_size=10,  # spatial smoothing with Hamming window\n", ")\n", "plt.imshow(d * 100, cmap=\"bwr\", extent=scan.extent * 1e3)\n", "plt.title(\"Doppler image (cm/s)\")\n", "plt.xlabel(\"X (mm)\")\n", "plt.ylabel(\"Z (mm)\")\n", "plt.colorbar()\n", "plt.show()" ] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": {
"name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.12.11" } }, "nbformat": 4, "nbformat_minor": 5 }