Slicer crashing before running out of memory

First off, thanks Murat and Andras for all your work in developing this program and offering so much help to the community in using it!

I’ve been using Slicer for a few years to do micro-CT scans of bones, largely without issue. However, lately I’ve started working with diceCT images and have been increasingly running into issues where the program crashes while there still appears to be plenty of RAM available.
For reference, I am running a Windows 11 system with 128 GB of memory as well as a good processor and graphics card (24 GB of VRAM on the card).

Here is a link to one of the datasets that has given me trouble:
https://drive.google.com/drive/folders/1iSmk92HOYNRLLgk7SIuiC3D6YI9RCdtQ?usp=sharing

I have also included two log files from sessions in which the program crashed. As far as I can tell, there is never any record of the crash in the log files, but I might just be missing it.

I have had issues at many points in the process. It has crashed when segmenting (specifically when using the flood fill tool with “show 3D” turned off). It has also crashed when I tried the “create models from visible segments” option. And finally, it sometimes crashes when I try to load the MRML scene file.

Throughout all of this I have been able to monitor my memory usage, and the RAM is never full when it crashes. It has often been crashing when I am using ~80GB of RAM, but sometimes it is higher or lower.

I updated all of my drivers and ran diagnostics on the memory sticks, but none of it indicated any hardware problems or helped with the issue. I know that the file is pretty large, but I’m nowhere near maxing out my hardware, so I’m not sure why that would be a problem.

Thanks for any thoughts or suggestions!
Dave

So I tried this data at full resolution on a powerful cloud instance, and I cannot reproduce a crash during segmentation or model export. I do not know exactly what processing you are doing, so it would be better if you could share the scene as an MRB just before the crash and tell us which step triggers it.

The log files you provided have no information about the crash.

The most important advice I would give is that you don’t need to segment the image at full resolution. The goal of segmentation is just to designate regions of the image; you don’t have to capture every small detail in the segmentation. If you downsample the input image by a factor of 2x … 6x along each axis (using the Crop volume module), you will get a small volume that is still perfectly suitable for designating regions in your image. This results in 10x … 200x less memory usage and approximately that much speedup of various operations (i.e., an operation that took a minute will take just a second). When your segmentation is ready, you can combine the low-resolution segmentation with your full-resolution volume using the Colorize volume module.
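For reference, a minimal Python console sketch of that downsampling step (the Crop volume parameter names and logic calls below are from memory, so treat them as approximate; the 4x factor is just an example):

```python
# Sketch: downsample a volume ~4x per axis with the Crop volume module
# before segmenting. Node names are examples.
import slicer

inputVolume = slicer.util.getNode("MyMicroCT")  # hypothetical node name

# ROI and parameter node for the Crop volume module
roiNode = slicer.mrmlScene.AddNewNodeByClass("vtkMRMLMarkupsROINode", "CropROI")
cropParams = slicer.mrmlScene.AddNewNodeByClass("vtkMRMLCropVolumeParametersNode")
cropParams.SetInputVolumeNodeID(inputVolume.GetID())
cropParams.SetROINodeID(roiNode.GetID())
cropParams.SetSpacingScalingConst(4.0)   # 4x coarser spacing along each axis
cropParams.SetIsotropicResampling(True)

cropLogic = slicer.modules.cropvolume.logic()
cropLogic.FitROIToInputVolume(cropParams)  # expand the ROI to cover the whole image
cropLogic.Apply(cropParams)

downsampledVolume = slicer.mrmlScene.GetNodeByID(cropParams.GetOutputVolumeNodeID())
print("Downsampled volume: " + downsampledVolume.GetName())
```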

For example, after performing some random segmentation at 6x lower resolution and using that for coloring of the original CT:

About the crashes: I cannot reproduce any crash on Windows with the latest Slicer Preview Release, with 64 GB RAM. What operating system do you use? Did the application actually crash or just hang? Do you load the image using the ImageStacks module or just the default “Add data” window? Do you convert these JPEG images from RGB to grayscale? I could not get over 40 GB memory usage - could you provide step-by-step instructions on exactly what you did to consume that much memory?

Thank you both for your responses! I didn’t realize that you could apply a low resolution segmentation to a high resolution volume, that is incredibly useful to know.

I tried to recreate the issue so I could save the scene file, but it appears to have gotten worse. Previously the program would just close itself with no warning or error message displayed (which explains why the logs don’t show any errors). Now it is displaying the following error if I try to save any segmentation as a model or if I try to show a segmentation in 3D:

So the current workflow (on Windows 11) that repeatably produces the error is this: I use the ImageStacks module to load the volume in grayscale (B&W) at full resolution with isotropic scaling. Then I go into the Segment Editor and create a new segment, to which I apply the Threshold tool (80-255). Then I go back to the Data module, right-click on the segmentation, and click “Export visible segments to models”. That’s when it throws the error:

“Slicer has caught an application error, please save your work and restart.

The application has run out of memory. Increasing virtual memory size in system settings or adding more RAM may fix this issue.

If you have a repeatable sequence of steps that causes this message, please report the issue following instructions available at https://slicer.org

The message detail is:
Exception thrown in event: bad array new length”

Previously it would think for a few minutes and then fully crash and close out the program.
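In case a script makes it easier to reproduce, here is roughly what those steps look like in the Python console (node names are placeholders and the calls are my best approximation of what the GUI does, so treat it as a sketch rather than exactly what I ran):

```python
# Sketch of the workflow: threshold a segment at 80-255, then export
# visible segments to models. Assumes the volume was already loaded
# with ImageStacks and is named "MyImageStackVolume" (placeholder).
import slicer

volumeNode = slicer.util.getNode("MyImageStackVolume")

segmentationNode = slicer.mrmlScene.AddNewNodeByClass("vtkMRMLSegmentationNode")
segmentationNode.CreateDefaultDisplayNodes()
segmentationNode.SetReferenceImageGeometryParameterFromVolumeNode(volumeNode)
segmentId = segmentationNode.GetSegmentation().AddEmptySegment("bone")

# Segment editor setup and Threshold effect (80-255)
segmentEditorWidget = slicer.qMRMLSegmentEditorWidget()
segmentEditorWidget.setMRMLScene(slicer.mrmlScene)
segmentEditorNode = slicer.mrmlScene.AddNewNodeByClass("vtkMRMLSegmentEditorNode")
segmentEditorWidget.setMRMLSegmentEditorNode(segmentEditorNode)
segmentEditorWidget.setSegmentationNode(segmentationNode)
segmentEditorWidget.setSourceVolumeNode(volumeNode)  # setMasterVolumeNode in older Slicer versions
segmentEditorNode.SetSelectedSegmentID(segmentId)
segmentEditorWidget.setActiveEffectByName("Threshold")
effect = segmentEditorWidget.activeEffect()
effect.setParameter("MinimumThreshold", "80")
effect.setParameter("MaximumThreshold", "255")
effect.self().onApply()

# Export visible segments to models (the step that fails for me)
shNode = slicer.mrmlScene.GetSubjectHierarchyNode()
folderId = shNode.CreateFolderItem(shNode.GetSceneItemID(), "Segment models")
slicer.modules.segmentations.logic().ExportVisibleSegmentsToModels(segmentationNode, folderId)
```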

As you can see, it was not using much memory when this occurred (29 GB out of 128 GB). Does this mean that it’s likely to be a hardware issue?

Hardware issues are extremely rare. The most common root cause of problems is between the keyboard and the display, or sometimes in the software.

In this case, it is a mix of a few things.

First of all, thresholding can create a very noisy surface with hundreds of millions of points, occluding relevant details. So, after thresholding noisy data, always apply a smoothing filter.

You’ve also discovered an issue in the VTK library’s vtkWindowedSincPolyDataFilter - the smoothing filter that is used on the surface extraction output. For inputs of a certain size (large, but not too large) it uses an incorrect ID type, which causes an integer overflow that leads to the exception you saw in the popup. We’ll work with VTK developers to fix it. Until then, you can disable surface smoothing or use surface nets smoothing, which is not affected by the vtkWindowedSincPolyDataFilter bug.

By enabling the “Experimental / Use surface nets” option in the “Show 3D” button menu, you also make surface generation about 10x faster.
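If you prefer to do this from the Python console, here is a minimal sketch for disabling the windowed-sinc smoothing on a segmentation (assuming the default closed-surface conversion rule and its “Smoothing factor” parameter; the node name is an example):

```python
# Sketch: turn off closed-surface smoothing so vtkWindowedSincPolyDataFilter
# is bypassed, then rebuild the 3D representation.
import slicer

segmentationNode = slicer.util.getNode("Segmentation")  # example node name
segmentationNode.GetSegmentation().SetConversionParameter("Smoothing factor", "0.0")

# Recreate the closed-surface (3D) representation with the updated parameter
segmentationNode.RemoveClosedSurfaceRepresentation()
segmentationNode.CreateClosedSurfaceRepresentation()
```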

Update:

Thresholding between 80-255 created a model that consists of over 180 million points and 360 million cells. An average model is under 100 thousand points, so this model is unusually large.

I’ve submitted a bug report to the VTK library anyway: “vtkWindowedSincPolyDataFilter crashes for large input data” (VTK GitLab issue #19346).

Without smoothing the segmentation looks like this (really noisy):

After median filtering with a 5 mm kernel size:

This smoothed segmentation is more meaningful and the generated mesh is about 10x smaller (under 40 million points and cells). Since the image is so noisy, it may also make sense to apply some smoothing (e.g., Curvature anisotropic diffusion) before segmentation.
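If you script your workflow, the median smoothing step would look roughly like this (a sketch that assumes a segment editor widget is already set up as in the script repository examples; “SmoothingMethod” and “KernelSizeMm” are the Smoothing effect parameter names as far as I recall):

```python
# Sketch: apply median smoothing with a 5 mm kernel to the selected segment.
# Assumes segmentEditorWidget is already configured with the segmentation
# and source volume (see the segment editor scripting boilerplate).
segmentEditorWidget.setActiveEffectByName("Smoothing")
effect = segmentEditorWidget.activeEffect()
effect.setParameter("SmoothingMethod", "MEDIAN")
effect.setParameter("KernelSizeMm", "5")
effect.self().onApply()
```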

Yes, I should have mentioned that’s the option I used.

I’ve been playing around with this a lot and just wanted to say thanks again to you both. “Use Surface Nets” has worked exceptionally well for me, and the denoising filters also make a huge difference without a noticeable impact on quality. Between these options I haven’t had any more issues with crashing and the whole process has become substantially faster.

Thanks so much!