Issue with HardenTransform

I’m writing a script that applies HardenTransform to a segmentation node, as follows.
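(The original snippet was not preserved in this post. As a hypothetical reconstruction, the typical pattern for hardening a transform on a segmentation node in Slicer's Python environment looks roughly like this; the node variable names are assumptions.)

```python
def harden_segmentation(segmentationNode, transformNode):
    """Apply transformNode to segmentationNode, then harden it.

    Hypothetical sketch of the usual Slicer scripting pattern --
    only runs inside the 3D Slicer application, where the `slicer`
    module is available.
    """
    import slicer  # provided by the 3D Slicer application

    # Observe the transform, then bake it into the segmentation geometry.
    segmentationNode.SetAndObserveTransformNodeID(transformNode.GetID())
    segmentationNode.HardenTransform()
```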


This works fine for most samples, but sometimes the script crashes (at the HardenTransform call) with the following error:

Unable to allocate 227735193073740 elements of size 4 bytes. 

"Slicer has caught an application error, please save your work and restart.

The application has run out of memory. Increasing swap size in system settings or adding more RAM may fix this issue.

If you have a repeatable sequence of steps that causes this message, please report the issue following instructions available at

The message detail is:

Exception thrown in event: std::bad_alloc"

The transform node has slizeSizeMM = [float(15.0), float(15.0)], but when I change this to e.g. slizeSizeMM = [float(13.0), float(13.0)], the script works fine on the samples that trigger this error.

From what I’ve read, there are known issues related to HardenTransform on large segmentations (see the topic “Harden transform on large segmentation hangs 3D Slicer”). In my case, the underlying segmentations seem to be “longer” than normal relative to other samples.

Is there a more general, effective solution to this issue?

OS: Ubuntu 20.04
Slicer version: 5.2.2


Typo, should be:


If you specify a transform that would make the resulting segmentation so big that it cannot fit into memory, then the operation is expected to fail. We can only tell whether it is a software issue or just incorrect input if you provide a sample data set and a processing script that reproduce the problem.

Ok, that makes sense. Unfortunately, I cannot provide you with sample data. Is there any way to catch this error instead of having the script crash?
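Since the reported allocation (227735193073740 elements × 4 bytes, roughly 9 × 10^14 bytes) suggests the computed output extent explodes, one possible mitigation is to estimate the hardened buffer size before calling HardenTransform and skip or warn when it exceeds a budget. The following is a plain-Python sketch, not Slicer API; the bounds format, bytes-per-voxel, and the 8 GiB limit are all assumptions:

```python
def estimated_voxels(bounds_mm, spacing_mm):
    """Estimate the voxel count of a volume covering bounds_mm.

    bounds_mm:  list of (min, max) pairs in mm, one per axis (assumed format)
    spacing_mm: voxel spacing in mm per axis
    """
    count = 1
    for (lo, hi), sp in zip(bounds_mm, spacing_mm):
        count *= max(1, int(round((hi - lo) / sp)) + 1)
    return count


def safe_to_harden(bounds_mm, spacing_mm, bytes_per_voxel=4,
                   limit_bytes=8 * 1024**3):
    """Return True if the estimated hardened buffer fits under limit_bytes.

    The 8 GiB default limit is an arbitrary illustrative budget.
    """
    return estimated_voxels(bounds_mm, spacing_mm) * bytes_per_voxel <= limit_bytes
```

A caveat: whether the original error is catchable from Python at all depends on whether Slicer’s application-level error handler intercepts the C++ std::bad_alloc before a Python exception is raised, which is why a pre-check like this may be more reliable than a try/except around the call.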


You can try to reproduce it with data sets from Slicer’s Sample Data module or other public data sets.