SlicerRT batch conversion memory issue

I am trying to use the batch conversion tool with SlicerRT and I've been running into memory issues. I am using it to convert label maps from a structure set (Struct) file and an accompanying CT scan. Is there any reason why I would be getting this type of issue, and how much memory is required? I am running it on a PC with 32 GB of RAM, so not an insignificant amount.

If you set a large oversampling value, the extents are large, or you have many segments, then memory usage can be excessive.
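To get a feel for why these factors multiply, here is a rough back-of-the-envelope estimate (just a sketch; the actual SlicerRT allocation pattern differs, and the one-byte-per-voxel figure is an assumption):

```python
def estimated_labelmap_bytes(dims, oversampling=1.0, num_segments=1, bytes_per_voxel=1):
    """Rough memory estimate for per-segment binary labelmaps.

    Oversampling scales every axis, so memory grows with its cube;
    each segment stored on its own layer adds another full volume.
    """
    voxels = 1
    for d in dims:
        voxels *= int(d * oversampling)
    return voxels * bytes_per_voxel * num_segments

# A 512 x 512 x 296 CT with 2x oversampling and 20 segments:
gib = estimated_labelmap_bytes((512, 512, 296), oversampling=2, num_segments=20) / 2**30
print(f"{gib:.1f} GiB")  # roughly 11.6 GiB, even at 1 byte per voxel
```

So a structure set with many segments on an oversampled grid can plausibly exhaust 32 GB once intermediate buffers are counted as well.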

Try latest Slicer Preview Release, as it contains many memory optimizations.

If you still have issues then attach the application log (menu: Help / Report a bug) obtained with latest Slicer Preview Release.

I'm using one of the latest nightly previews (from maybe a week ago). I'm not sure what the issue is, or whether it is a Slicer bug, a bug in the extension, or expected behaviour.

Attach the application log (menu: Help / Report a bug) of a session where you run out of memory. It may help us answer what is causing the issue and how to resolve it.

I have the log; what is the best way to share it (it's too large to copy-and-paste)?

You can use any cloud storage provider (Dropbox, OneDrive, Google Drive, etc.) and post the link here.

Thank you, these logs are already useful, but it is hard to tell for sure what's wrong. Can you share the structure set files (anonymized, or phantom data sets)?

@Sunderlandkyl do any of the errors shown in the log look familiar to you?

No, it’s not an error that I’ve seen before.

Is this something that occurs with any data you want to convert or one specific dataset?

Here is a link to the file. Sorry for the delay, I just had to make sure the file was properly deidentified.

I've only been trying with one specific dataset so far; I haven't tried with others. That being said, there shouldn't be anything specifically different about this dataset compared to others I have.

What is the resolution of the accompanying CT scan?

The CT scan is a 512 × 512 × 296 volume with a voxel spacing of 1.171 × 1.171 × 1 mm³.

That is quite standard, it shouldn’t be a problem.

I wouldn't be so sure. Maybe there is something going on with one of the contours that confuses our conversion algorithm and makes it allocate way too much memory.

Thanks for the data! I think the only way to see it is to try it ourselves. @Sunderlandkyl do you have some time to check it out? You know much more about the contour conversion algorithm specifically, so if my hunch is right you're in a better position to find the root of the problem. Please let me know if you don't, and I'll give it a go when I can.

I know there are a lot of contours in the Struct file. However, there are only a few that I am actually interested in. Is there a way in the batch conversion tool to specify which labels I want converted?

Yes, I can take a look at it.

I’ve already looked at the contour to surface conversion in Slicer, and some of the structures were not converted correctly.
I think this is causing some additional issues in the closed surface to binary labelmap conversion, but I’ll debug and find out.


The following structures are not triangulated correctly from the contours: TS+2mmring, TS+4mmring, outer tissue ring, and bowel max20. However, that doesn't seem to be the issue.

Two things seem to be happening:

  • Memory usage is very high when collapsing the binary labelmaps into a single layer.
  • After collapsing the labelmaps, the closed surface to binary labelmap conversion is run again.
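For readers unfamiliar with the layer-collapse step, here is a simplified conceptual sketch in plain Python (not SlicerRT's actual implementation, which operates on oriented image data); it shows how per-segment binary masks get merged into one shared labelmap:

```python
def collapse_layers(layers):
    """Merge same-shaped per-segment binary masks into one labelmap.

    Later segments overwrite earlier ones where they overlap, which is
    why overlapping segments cannot share a layer and must be kept on
    separate (memory-hungry) layers instead.
    """
    rows, cols = len(layers[0]), len(layers[0][0])
    merged = [[0] * cols for _ in range(rows)]
    for label, mask in enumerate(layers, start=1):
        for r in range(rows):
            for c in range(cols):
                if mask[r][c]:
                    merged[r][c] = label
    return merged

a = [[1, 1, 0], [0, 0, 0]]  # segment 1
b = [[0, 1, 1], [0, 0, 0]]  # segment 2, overlaps segment 1 in one voxel
print(collapse_layers([a, b]))  # [[1, 2, 2], [0, 0, 0]]
```

The real operation works on full 3D volumes, so intermediate copies made during this merge are one plausible place for the memory spike.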

I’m working on a fix.


This should be fixed in tomorrow's preview release (c4fe6c5).
The reference geometry wasn't being applied to the segmentation.