3D slicer bogs down and crashes when upscaling segmentation geometry

Hi all,

I’m new to 3D Slicer. I’m trying to create a high-fidelity model of my own post-craniotomy skull, and I’m currently trying to eliminate the “stepping” on the model by using the Segmentation Geometry button to oversample to the point where the stepping is no longer significant. However, every time I get down to a spacing of around 0.1, Slicer bogs down and crashes on any edit to the segmentation. No crash warning, no error window: one minute Slicer is there, and the next it isn’t.

I am running an i9 (3.7 GHz, 10 cores) with 32 GB of RAM and a further 80 GB of virtual memory on my SSD. Am I demanding too much of my machine to achieve this, or is there another way to accomplish what I’m looking for?

I’ve attached a screenshot of the current, very rough segmentation. I seem to be able to threshold at this level, but if I try to remove small islands or manually erase anything, Slicer spends a few minutes cranking away and then crashes.

If your original spacing is, say, 0.5 mm and you change it to 0.1 mm, that’s a 5×5×5 = 125-fold increase in data volume. So yes, Slicer is probably crashing due to an out-of-memory error.
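The back-of-the-envelope arithmetic above is easy to sketch. The volume dimensions and byte-per-voxel figure below are illustrative assumptions, not from the original post:

```python
# Estimate how labelmap memory grows when isotropic voxel spacing shrinks.
# 512x512x300 voxels and 1 byte per labelmap voxel are assumed values.

def upsample_factor(old_spacing_mm: float, new_spacing_mm: float) -> float:
    """Voxel-count multiplier when isotropic spacing changes."""
    return (old_spacing_mm / new_spacing_mm) ** 3

voxels = 512 * 512 * 300                      # original voxel count (assumed)
factor = upsample_factor(0.5, 0.1)            # 0.5 mm -> 0.1 mm spacing
bytes_per_voxel = 1                           # typical for a binary labelmap
gib = voxels * factor * bytes_per_voxel / 2**30
print(f"{factor:.0f}x more voxels, ~{gib:.1f} GiB per labelmap copy")
```

Keep in mind that the editor may hold several copies of the labelmap at once (undo states, previews), so peak memory use can be a multiple of that single-copy figure.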

You can’t really improve the staircase effect much with oversampling if your original data is of lower resolution: oversampling simply subdivides the voxels into smaller chunks. You should use smoothing (both at the labelmap and the 3D-model level) to reduce the staircase effect. Where oversampling can be helpful is when you have thin bones (such as the orbital wall): if you do not oversample, running the smoothing might make holes in those structures bigger. But even then, use an oversampling factor of only 2 or maybe 3, keeping in mind that the size of your data grows with the cube of the oversampling factor.
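To see why smoothing (rather than oversampling) removes the staircase, here is a toy pure-Python stand-in for labelmap smoothing: blur a binary mask with a 3×3 mean filter, then re-threshold at 0.5. This is only an analogy for the Gaussian smoothing in Slicer’s Segment Editor, not its actual implementation:

```python
# Toy illustration: mean-filter a binary mask, then re-threshold at 0.5.
# Convex staircase corners get shaved off and concave ones get filled in,
# which is the essence of how labelmap smoothing reduces stepping.

def smooth_then_threshold(mask, passes=1):
    """3x3 mean filter followed by re-thresholding at 0.5."""
    h, w = len(mask), len(mask[0])
    vals = [[float(v) for v in row] for row in mask]
    for _ in range(passes):
        out = [[0.0] * w for _ in range(h)]
        for y in range(h):
            for x in range(w):
                acc = n = 0
                for dy in (-1, 0, 1):
                    for dx in (-1, 0, 1):
                        yy, xx = y + dy, x + dx
                        if 0 <= yy < h and 0 <= xx < w:
                            acc += vals[yy][xx]
                            n += 1
                out[y][x] = acc / n
        vals = out
    return [[1 if v >= 0.5 else 0 for v in row] for row in vals]

# A coarse diagonal edge: a classic staircase.
stair = [
    [1, 1, 1, 1, 0, 0],
    [1, 1, 1, 1, 0, 0],
    [1, 1, 0, 0, 0, 0],
    [1, 1, 0, 0, 0, 0],
    [0, 0, 0, 0, 0, 0],
]
smoothed = smooth_then_threshold(stair)
```

The same mechanism is also why thin structures can vanish: a one-voxel-thick wall averages below the threshold everywhere, which is exactly the case where a small amount of oversampling first helps.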

I’d love to get to 2 or 3; unfortunately, 1.75 seems to be about where I hit a wall. I’m able to get some good detail at that point, but I do still lose quite a bit of orbital wall even before applying smoothing. Is this something that access to more capable hardware might solve, or should I accept this as about what I can expect from the data I have and farm the model out to someone to touch up in Blender or another tool?

This is probably as good as you can expect from this data. There are some deep-learning super-resolution techniques that might help with the orbital wall (applied to the original volume), but I don’t know much about them beyond the fact that they exist…

The crashes are almost certainly due to running out of memory. While your machine sounds good for general-purpose work, you can easily rent machines with more RAM from any of the cloud providers, or get access for free if you have an academic affiliation.

If your goal is 3D printing, then upsampling should help you preserve details and smoothness that would otherwise be lost during segmentation.

If your goal is mainly to visualize the data, then using volume rendering is often a more effective solution. You may also want to combine segmentation and volume rendering using the Colorize Volume module in the Sandbox extension.