Problem using SegmentMesher

Hello,
I encountered a problem while using SegmentMesher. I used Slicer 4.10.1 in the past and am now moving to the newest stable version. I found that SegmentMesher in the previous version needs considerably less RAM than in the latest version; I checked this with multiple examples and can post some of them with details here if necessary. I should mention that I use the same SegmentMesher settings in both versions.
Any ideas on where this problem is coming from?
It is true that we are always encouraged to use the newest version, but in this case I could use the old SegmentMesher to generate a mesh for my model, while the new version needs so much RAM that the process gets killed due to lack of memory on my system. On the other hand, I have to use the newest version to benefit from the updates.
Any help on this is greatly appreciated.
I have attached the Python code of SegmentMesher from both versions in case it is helpful.

https://github.com/A-ep93/SlicerTest/blob/main/compare.zip?raw=true

In Slicer-4.11.20200930 we upgraded to the latest Cleaver2 version (Slicer-4.10.2 used a 3-year-old version). Some command-line arguments changed slightly, and spacing is now correctly taken into account, while previously it was ignored (the parameters were interpreted in voxel space).

You can get the same quality results with the same memory usage as before, but you may need to adjust the --feature-scaling, --sampling-rate, and --lipschitz parameters as described here.
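Before re-tuning those values, it can help to check the geometry that the parameters are now interpreted against. A minimal sketch (run in the Slicer Python console; "inputLabelmap" is a placeholder node name, use the name of your own exported labelmap node):

# Print spacing and dimensions of the exported labelmap, so that values that
# used to be specified in voxel units can be re-expressed in physical (mm) units.
# "inputLabelmap" is a placeholder node name.
labelmapNode = slicer.util.getNode("inputLabelmap")
print("Spacing (mm):", labelmapNode.GetSpacing())
print("Dimensions (voxels):", labelmapNode.GetImageData().GetDimensions())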

Thanks for your response @lassoan. I know that you updated Cleaver for the new version. In fact, I was one of the users who asked for it :slightly_smiling_face:.
I know that the command-line arguments have changed and I took that into account, but I think there is something else causing the problem. I even checked another version (4.11.0), in which Cleaver is not yet updated, and it still needs more RAM than 4.10.1. With this in mind, do you have any idea what could cause this problem?

You can ask the Cleaver developers, but I don’t think there was a change in the algorithm that would have increased memory usage. The Cleaver changes looked like mostly cosmetic fixes and bugfixes, nothing fundamental.

Thanks for your response @lassoan. Is there any way to see how SegmentMesher has changed from version to version?
It would help me track the changes and explain them better to the Cleaver developers.

It’s all on GitHub, so you can compare any versions very easily. The Cleaver upgrade commit is here: Update Cleaver2 to latest master version · lassoan/SlicerSegmentMesher@2c42c47 · GitHub

As you can see, the only relevant changes are updating Cleaver2 version (repository and git hash) and command-line argument names and default values.

Hello @lassoan. I did a lot of trial and error in different versions. I found that adding these lines results in increased RAM usage during the calculations:

# Set reference geometry in labelmapVolumeNode
referenceGeometry_Segmentation = slicer.vtkOrientedImageData()
inputSegmentation.GetSegmentation().SetImageGeometryFromCommonLabelmapGeometry(referenceGeometry_Segmentation, None,
  slicer.vtkSegmentation.EXTENT_REFERENCE_GEOMETRY)
slicer.modules.segmentations.logic().CopyOrientedImageDataToVolumeNode(referenceGeometry_Segmentation, labelmapVolumeNode)

# Add margin
extent = labelmapVolumeNode.GetImageData().GetExtent()
paddedExtent = [0, -1, 0, -1, 0, -1]
for axisIndex in range(3):
  paddingSizeVoxels = int((extent[axisIndex * 2 + 1] - extent[axisIndex * 2]) * paddingRatio)
  paddedExtent[axisIndex * 2] = extent[axisIndex * 2] - paddingSizeVoxels
  paddedExtent[axisIndex * 2 + 1] = extent[axisIndex * 2 + 1] + paddingSizeVoxels
labelmapVolumeNode.GetImageData().SetExtent(paddedExtent)
labelmapVolumeNode.ShiftImageDataExtentToZeroStart()
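
As a rough illustration of how much the padding alone can grow the image (assuming a hypothetical paddingRatio of 0.1; the value the module actually uses may differ):

# Rough estimate only: padding each axis by paddingRatio at both ends grows the
# total voxel count by about (1 + 2 * paddingRatio)**3.
# paddingRatio = 0.1 is an assumed value for illustration.
paddingRatio = 0.1
growthFactor = (1 + 2 * paddingRatio) ** 3
print("Approximate voxel count growth factor: %.2f" % growthFactor)  # ~1.73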

I deleted these lines in several versions up to 4.11.0, and it worked without error and with considerably lower RAM usage. However, in the latest version (4.11.20200930), when I delete these lines SegmentMesher does not work and gives this error:

Nrrd file read error: No zero crossing in indicator function. Not a valid file or need a lower sigma value.

Now I have two questions:

  1. Why do I get an error in the latest version when I delete these lines, while the other versions work fine?
  2. Can this code be modified to use less RAM without deleting it?

Your help is greatly appreciated.

Exporting the segmentation to a labelmap and padding the image should not result in a significant memory increase, as the whole image should not be more than a few hundred MB in size and you are expected to have at least 5-10 GB of free memory.

Nrrd file read error: No zero crossing in indicator function. Not a valid file or need a lower sigma value.

Most likely you get this error because you are using an old Cleaver version (where indicator functions are used by default) with a new Slicer version (which does not set the --segmentation option because it does not exist in the current Cleaver version).

If you want to experiment with various older Cleaver versions then you probably need to do it using the command line. You can find all the Cleaver command-line options that Slicer uses in the application log, and Segment Mesher has an option to preserve the data sets that it provides to Cleaver.
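As a rough sketch of what that could look like (the paths below are placeholders, and the option values are only examples of what Segment Mesher passes; copy the actual executable path, input file, and options from your own application log):

import subprocess

# Placeholder paths - copy the real cleaver-cli and temporary labelmap paths
# from the Slicer application log of your own Segment Mesher run.
cleaverCli = "/path/to/SegmentMesher/lib/Slicer-4.11/cleaver-cli"
inputLabelmap = "/path/to/inputLabelmap.nrrd"
outputPath = "/path/to/output/"

# Example options (current Cleaver argument names); adjust values to experiment.
# Note that older Cleaver versions may use different argument names.
subprocess.run([
  cleaverCli,
  "--input_files", inputLabelmap,
  "--output_path", outputPath,
  "--output_format", "vtkUSG",
  "--fix_tet_windup", "--strip_exterior", "--verbose",
  "--feature_scaling", "2",
  "--sampling_rate", "0.2",
], check=True)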

Thanks for your response @lassoan.
I checked the .nrrd file created in my temp folder while using SegmentMesher. Its size is about 2.1 GB, which I think is very large. I have 64 GB of RAM but I am still not able to get the mesh I want. Is there any way to change my segmentation so that it results in a smaller labelmap file?
That way maybe I would not even need to delete those code lines and could use SegmentMesher as it is.
Thanks in advance

A 2.1 GB image is very big for a patient image. I guess you oversampled the image so that you can create margins more accurately. What is the spacing of the volume and the number of voxels along each axis?

So my problem is definitely related to this large volume. The spacing is 1 mm in all directions and the dimensions are 1012 × 1024 × 1014 voxels.
Is there any way I can reduce the size of the generated labelmap without having to redo all the segmentations?
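For what it's worth, that file size is consistent with these dimensions if we assume 2 bytes per voxel (a 16-bit labelmap; this byte size is an assumption):

# Back-of-the-envelope check, assuming 2 bytes per voxel.
dims = (1012, 1024, 1014)
voxels = dims[0] * dims[1] * dims[2]   # ~1.05e9 voxels
sizeGB = voxels * 2 / 1e9              # ~2.1 GB uncompressed
print("Voxels: %d, approx. size: %.1f GB" % (voxels, sizeGB))
# Doubling the spacing in each direction (e.g. 2 mm) would reduce this by about 8x.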

Hello again @lassoan. I could reduce the size of my current segmentation by changing the master volume to a smaller one in Segment Editor. Now my generated labelmap is about 500 MB.
I reinstalled everything (Slicer + SegmentMesher) but again, I get this error:

Generate mesh using: /home/user/.config/NA-MIC/Extensions-29402/SegmentMesher/lib/Slicer-4.11/cleaver-cli: ['--input_files', '/tmp/Slicer-user/SegmentMesher/20201027_210026_627/inputLabelmap.nrrd', '--output_path', '/tmp/Slicer-user/SegmentMesher/20201027_210026_627/', '--output_format', 'vtkUSG', '--fix_tet_windup', '--strip_exterior', '--verbose', '--feature_scaling', '2', '--sampling_rate', '0.2']

Nrrd file read error: No zero crossing in indicator function. Not a valid file or need a lower sigma value.

Command 'cleaver-cli' returned non-zero exit status 11.

I am using the latest stable version of Slicer with the latest SegmentMesher. So what is causing this error now?

Thanks in advance for your help.