Estimating memory required for NNinteractive

Is there a way to predictably calculate the memory required to segment a volume using NNinteractive? For example, given a 400x400x400 16-bit volume, how much GPU memory would I need?

Trying to figure this out by trial and error is a fairly frustrating experience, as you have to wait until it fails…
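For reference, the raw input tensor itself is small; here is a generic back-of-envelope calculation (nothing NNinteractive-specific, and it ignores the network activations, which presumably dominate the actual peak):

```python
# Footprint of just the input volume, not the model or its feature maps.
voxels = 400 * 400 * 400          # 64 million voxels
raw_bytes = voxels * 2            # 16-bit input
as_float32 = voxels * 4           # if cast to float32 on the GPU

print(f"raw uint16 volume : {raw_bytes / 2**20:.0f} MiB")   # ~122 MiB
print(f"as float32 tensor : {as_float32 / 2**20:.0f} MiB")  # ~244 MiB
# Peak GPU usage will be some multiple of this once feature maps at several
# resolutions, interaction channels, and the model weights are added.
```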

I was running with nvtop installed and watching how high the memory bar gets for different scenarios. This could give you a feel for the size-vs-memory tradeoff, but I'm not sure whether there are other factors involved (like the autozoom).
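If you'd rather record those peaks from Python than eyeball nvtop, something like the sketch below works on a PyTorch/CUDA backend. `run_segmentation` here is just a hypothetical placeholder for however you invoke NNinteractive on a crop; swap in your actual call.

```python
import numpy as np
import torch

def run_segmentation(volume: np.ndarray) -> None:
    """Placeholder: replace with your actual NNinteractive inference call."""
    # Stand-in so the measurement machinery below is runnable on its own:
    # just move the crop to the GPU as float32.
    t = torch.from_numpy(volume.astype(np.float32)).cuda()
    del t

def peak_memory_mib(volume: np.ndarray) -> float:
    """Return the peak CUDA memory (MiB) allocated while segmenting `volume`."""
    torch.cuda.empty_cache()
    torch.cuda.reset_peak_memory_stats()
    run_segmentation(volume)
    torch.cuda.synchronize()
    return torch.cuda.max_memory_allocated() / 2**20

# Probe a few cube sizes to see how peak memory scales with voxel count.
for edge in (200, 300, 400):
    crop = np.zeros((edge, edge, edge), dtype=np.uint16)
    print(f"{edge}^3 -> {peak_memory_mib(crop):.0f} MiB")
```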

I do that, but I was hoping for some sort of deterministic method I could use to decide on the largest volume chunk I can fit in one go. Our segmentations are too big as is, so I am trying to figure out how to crop them into manageable regions (4, 8, or 16 ROIs; the fewer the better from a data management perspective).
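For the cropping itself, this is roughly what I have in mind: split the volume on a small regular grid (2x2x1 = 4, 2x2x2 = 8, 4x2x2 = 16 ROIs), with a bit of overlap so the segmentations can be stitched back at the seams. This is only a sketch; the grid shape and overlap are whatever your memory budget allows.

```python
import itertools
import numpy as np

def crop_into_rois(volume: np.ndarray, grid=(2, 2, 2), overlap=16):
    """Return (slices, crop) pairs covering `volume` with a regular grid of ROIs."""
    rois = []
    # Grid edges along each axis, e.g. [0, 200, 400] for size 400 split in two.
    edges = [np.linspace(0, s, g + 1, dtype=int) for s, g in zip(volume.shape, grid)]
    for idx in itertools.product(*(range(g) for g in grid)):
        sl = tuple(
            slice(max(e[i] - overlap, 0), min(e[i + 1] + overlap, s))
            for e, i, s in zip(edges, idx, volume.shape)
        )
        rois.append((sl, volume[sl]))
    return rois

# Example: split an 800x800x400 volume into 4 overlapping ROIs.
vol = np.zeros((800, 800, 400), dtype=np.uint16)
for sl, crop in crop_into_rois(vol, grid=(2, 2, 1)):
    print(sl, crop.shape)
```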