There has been an internal effort at Kitware Europe to simplify the use of segmentation models available as MONAI bundles from 3D Slicer.
The goal of this post is to gather some feedback about the best way to share this work.
For this purpose, we are working on a module that allows users to load and run segmentation models from a bundle available locally, or to download one from the MONAI Model Zoo. Our intention is to make this work available to the community, and we initially planned to create a new 3D Slicer extension for it.
However, we realized that our work overlaps with what the SlicerMONAIAuto3DSeg extension (@lassoan) offers, which is why we are now pondering whether contributing our work to that project would make more sense. A new extension might be redundant, as both projects aim to provide an easy way to run inference for MONAI segmentation models from within 3D Slicer.
Here are some of the features we could bring that are not, as far as we are aware, available in SlicerMONAIAuto3DSeg (a minimal sketch of the bundle-running part follows the list):
- support for models distributed as MONAI bundles (instead of Auto3DSeg)
- loading local bundles from disk
- retrieving bundles from the MONAI Model Zoo
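To give an idea of what running a local bundle could look like, here is a minimal sketch using MONAI's bundle API (`ConfigWorkflow` in recent MONAI versions); the bundle name and paths below are illustrative, and the actual module code may differ:

```python
# Minimal sketch: run inference from a locally stored MONAI bundle.
# Paths and bundle name are illustrative; assumes a recent MONAI release
# that provides monai.bundle.ConfigWorkflow.
from monai.bundle import ConfigWorkflow

bundle_root = "./bundles/spleen_ct_segmentation"  # example local bundle

workflow = ConfigWorkflow(
    workflow_type="infer",
    config_file=f"{bundle_root}/configs/inference.json",
    meta_file=f"{bundle_root}/configs/metadata.json",
    logging_file=f"{bundle_root}/configs/logging.conf",
)
workflow.initialize()  # parse the configs and build the components
workflow.run()         # execute the bundle's inference pipeline
workflow.finalize()
```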
Note that, like SlicerMONAIAuto3DSeg, our work has limitations due to the assumptions we need to make about the models (the MONAI bundle format being very permissive and general, to allow for many different architectures, inputs/outputs, etc.).
The features you described could very easily be added to the MONAIAuto3DSeg extension (and we have been considering adding support for running from a local folder anyway), so collaborating on this extension would make the most sense.
@diazandr3s what do you think? If the scope of the extension is broadened, then we may need to rename the extension.
@rfenioux If you send a pull request with a runner script for MONAI bundle models and a code snippet to download a model from the MONAI Model Zoo, then I can take care of the rest.
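For reference, the download part can be a one-liner with MONAI's bundle API (the bundle name and directory below are just examples):

```python
# Example: fetch a bundle from the MONAI Model Zoo into a local directory.
from monai.bundle import download

download(name="spleen_ct_segmentation", bundle_dir="./bundles")
```

The equivalent CLI call is `python -m monai.bundle download --name spleen_ct_segmentation --bundle_dir ./bundles`.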
Note that in this extension we have achieved two important goals:
- democratizing AI segmentation: we do not require a GPU; a low-resolution version of each model is provided that runs on the CPU in a few minutes
- describing segmentation output using standard terminology codes (using some arbitrary common English name is not suitable for clinical use)
It would be nice if the newly added models could meet these standards, but it is not a dealbreaker if they don’t (we could add a filtering option to hide models that only work on a GPU or do not properly specify their segmentation output).
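For illustration, here is roughly how a segment could be tagged with a standard terminology entry from Slicer’s Python console; the serialized terminology string below (SNOMED CT codes for “Spleen”) is an example, and the exact context list names may differ:

```python
# Illustrative sketch (run in 3D Slicer's Python console): attach a standard
# terminology entry to a segment instead of relying on a free-text name.
segmentationNode = slicer.mrmlScene.GetFirstNodeByClass("vtkMRMLSegmentationNode")
segmentation = segmentationNode.GetSegmentation()
segment = segmentation.GetSegment(segmentation.GetNthSegmentID(0))

# Serialized terminology entry: category (Anatomical Structure) and
# type (Spleen) coded in SNOMED CT; the context names are examples.
terminologyEntry = (
    "Segmentation category and type - 3D Slicer General Anatomy list"
    "~SCT^123037004^Anatomical Structure"
    "~SCT^78961009^Spleen"
    "~^^"
    "~Anatomic codes - DICOM master list"
    "~^^"
    "~^^"
)
segment.SetTag(slicer.vtkSegment.GetTerminologyEntryTagName(), terminologyEntry)
```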
I agree with @lassoan: MONAI Bundles can easily be integrated/added to the MONAI Auto3DSeg extension - we might need to change the name once the extension can consume bundles.
MONAI Bundles allow you to do both inference and training. So far, MONAIAuto3DSeg has been designed for inference only (on both CPU and GPU). We assume the training is done with Auto3DSeg in a separate cycle.
There are Bundles for different types of image modalities: endoscopy (video), pathology and radiology (CT and MR). Within radiology, there are bundles for detection, segmentation and landmark detection.
Here are the current ones for radiology applications:
I’d suggest starting with the bundles for segmentation in radiology, and among those, the ones using a single modality/sequence. Once this is working, we can move to multimodality and then detection.
I’ll be more than happy to further discuss this over a video call.
Thank you both for your valuable input, I’m glad to see our intuition confirmed.
Strictly enforcing these rules would, I think, require having control over the available models, which is not the case when running locally stored bundles. It’s also not necessarily appropriate for this use case, since people may want to run custom bundles (e.g. fine-tuned on their specific tasks).
However, filtering would make sense for models retrieved from the Model Zoo (we’ll need to filter for supported bundles anyway).
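As a rough sketch of what that filtering could look like, assuming `monai.bundle`’s query helpers; checking the `task` field of each bundle’s `configs/metadata.json` is just a heuristic, and a real implementation would cache or query metadata rather than download every bundle:

```python
# Rough sketch: list Model Zoo bundles and keep only segmentation bundles,
# based on the "task" field of each bundle's configs/metadata.json.
import json
from pathlib import Path

from monai.bundle import download, get_all_bundles_list


def supported_segmentation_bundles(bundle_dir: str = "./bundles"):
    supported = []
    for name, version in get_all_bundles_list():
        download(name=name, version=version, bundle_dir=bundle_dir)
        meta_path = Path(bundle_dir) / name / "configs" / "metadata.json"
        if not meta_path.exists():
            continue  # not a well-formed bundle, skip it
        meta = json.loads(meta_path.read_text())
        if "segmentation" in meta.get("task", "").lower():
            supported.append((name, version))
    return supported
```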
I fully agree. Inference, and especially segmentation, is the task that benefits most from being integrated into a 3D Slicer workflow (and conversely, it is probably what most 3D Slicer users are interested in).