I have been saving files as MRB files. I realized today, while going through my registrations, that the full body scans that had previously been used for segmentation (which were all different) had somehow been reduced to copies of the second full body volume. I have no idea how this happened, as there was no issue until a couple of weeks ago. I still have the original files that I had segmented, but when I pulled them all into the same scene and started registration, all of the volumes except the one I registered to turned into duplicates of the first moving volume. Here is an example:
When a volume node contains an empty image, selecting it in the slice view may leave the previously selected volume displayed. Can you confirm that in your case the volumes that appear to be the same as the “first moving volume” actually contain valid image data? You can verify that by zooming in/out (if the image is not valid then you cannot zoom in/out) and by showing only that volume as the background volume (select None as the foreground volume).
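Another way to check is to compare the voxel arrays directly. The sketch below uses plain NumPy with synthetic arrays for illustration; in Slicer's Python console you would obtain the arrays from your volume nodes with `slicer.util.arrayFromVolume(node)` instead (the node names and array sizes here are made up):

```python
import numpy as np

# In Slicer's Python console you could obtain the voxel arrays with
#   a = slicer.util.arrayFromVolume(nodeA)
#   b = slicer.util.arrayFromVolume(nodeB)
# Here synthetic arrays stand in for the volumes.
a = np.random.default_rng(0).integers(0, 255, size=(4, 8, 8))
b = a.copy()   # a true duplicate
c = a + 1      # a genuinely different volume

def same_volume(x, y):
    """Two volumes are duplicates only if both shape and voxel values match."""
    return x.shape == y.shape and np.array_equal(x, y)

print(same_volume(a, b))  # True  -> the images really are identical
print(same_volume(a, c))  # False -> they only look similar in the views
```

If two nodes that should be different scans compare equal this way, the data itself was duplicated; if they compare unequal, the problem is only in what the slice views display.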
Can you reproduce the issue?
Do you have scenes that you can load individually and they appear correctly but when you load them all into one scene they show up differently?
Can you share those scenes so that we can investigate?
Do you get any warnings or errors when you load the scene?
I can reproduce the issue. The scenes inside the compressed MRB files that contained my registrations all load incorrectly. Is there a file size limitation that can cause issues like this? I just created a separate, much smaller MRB file with only specific segmentations and volumes, and so far it seems to load fine and registration appears to work. I haven’t tried saving it and reloading it yet.
If you run out of memory then anything can happen. How much RAM do you have? How much virtual memory have you configured for your system? What is the size of all data sets in the scene (uncompressed)? Is there any error logged during scene loading? Can you share the data sets?
So I’m working on a desktop with 32 GB of RAM. The virtual memory configuration is the default, as I don’t know how to adjust it. Each scene is approximately 680 MB uncompressed, I believe, as the MRB files saved for each individual are about 300–350 MB. No error was logged when I loaded them. Unfortunately, the school is working on being able to share our data, but currently we are not allowed to, as it all comes from our human gift program and the families need to sign off.
To investigate or fix the problem, it is essential to be able to reproduce it. If you cannot share the images, then blank out the images with some simple shapes (you can do that using the Segment Editor’s Mask volume effect, after installing the SegmentEditorExtraEffects extension), save the scene, and see if you can reproduce the problem using these modified images. Alternatively, save the images in .nhdr format, delete the pixel data (the .raw file), and send all the other files.
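The blanking step can also be scripted. The sketch below uses a plain NumPy array standing in for a volume (the array size and intensity values are made up); in Slicer you could modify the real volume from the Python console with `slicer.util.arrayFromVolume(volumeNode)`, edit the array in place, and then call `slicer.util.arrayFromVolumeModified(volumeNode)`:

```python
import numpy as np

# Synthetic stand-in for a CT volume (values and size are arbitrary).
arr = np.random.default_rng(1).integers(-1000, 3000, size=(16, 32, 32)).astype(np.int16)

arr[:] = 0                                # wipe all original intensities
k, j, i = np.ogrid[:16, :32, :32]
sphere = (k - 8) ** 2 + (j - 16) ** 2 + (i - 16) ** 2 < 8 ** 2
arr[sphere] = 100                         # keep a simple shape so the scene is still testable

print(arr.min(), arr.max())               # only 0 and 100 remain
```

After this, the scene keeps its geometry, transforms, and node structure (which is what is needed to reproduce the bug) while no patient image content remains.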
I’m sorry, I’m not fully understanding what you are asking me to do. Are you asking me to decompress the MRB files that aren’t loading properly, apply the Segment Editor operation to those, and send the scenes? Or are you asking me to just send the scenes and files that I pull into Slicer individually?
I would need the scene files to reproduce the problem. You would prefer not to share the data because of the image content, but I actually don’t need the voxel data; an image filled with some solid color is just as good.
So I suggest blanking out the voxel data in your images so that you can share the scenes.
Once you have blanked out the voxel data in your images, save the scene and share it with us (there should be no problem sharing it, as the original image data is no longer there).
I think I figured out the problem anyway. My CPU and memory were maxed out, and my storage was 2 GB shy of being full. In fact, when I tried running it last, the whole computer overloaded and shut down.
Thanks for the update. In general, most software behaves unpredictably (crashes, reports various errors, etc.) if it runs out of memory. As a rule of thumb, allocate at least 10x more virtual memory than the total size of all the data that you load (backed by as much physical memory as possible).
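As a quick sanity check, the rule of thumb can be applied to the figures in this thread. The number of scenes loaded at once is an assumption here, just to show the arithmetic:

```python
# Rough sizing check based on the 10x rule of thumb above.
# scene_size_mb is the ~680 MB uncompressed figure from this thread;
# scenes_loaded is a hypothetical number of scenes open in one session.
scene_size_mb = 680
scenes_loaded = 4

total_mb = scene_size_mb * scenes_loaded
recommended_virtual_memory_mb = 10 * total_mb

print(total_mb, recommended_virtual_memory_mb)  # 2720 27200
```

So loading four such scenes at once would suggest on the order of 27 GB of virtual memory, which is close to the limit of a 32 GB machine once the OS and other applications are accounted for.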