Loading in huge data

So a user from here called Alex_Vergara helped me process huge batches of .png images.

He made an incredibly good scripted module, but I have some questions about Slicer in general.

I noticed that once a batch of PNG images is being loaded in, it is not the disk speed that matters. I got an NVMe drive from a friend, a Samsung 980 PRO PCIe 4.0 (up to 7,000 MB/s).

On top of that, our boss upgraded the machines to over 60 GB of RAM, because what he had me working with some months ago was, honestly, pretty horrible.

With this new NVMe hardware I assumed loading would be faster. But it isn’t, or at least I don’t notice any change.

The speed is supposedly more than 20 times faster than the old SSD.

Well… Why?

I assume there is some recalculation the CPU has to do, so the disk speed doesn’t matter that much because the bottleneck is the CPU’s processing speed?
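One way to check that hypothesis would be to time a raw disk read of one slice against a read-plus-decode of the same slice. Here is a rough sketch (mine, not Alex_Vergara’s module), assuming Pillow and NumPy are available; slice_0001.png is just a placeholder name:

```python
import time

import numpy as np
from PIL import Image  # Pillow, assumed to be installed

path = "slice_0001.png"  # placeholder file name

t0 = time.perf_counter()
with open(path, "rb") as f:
    raw = f.read()  # pure disk read, no decoding
t1 = time.perf_counter()

img = np.asarray(Image.open(path))  # read again, plus PNG decode on the CPU
t2 = time.perf_counter()

print(f"raw read     : {t1 - t0:.4f} s ({len(raw) / 1e6:.1f} MB)")
print(f"read + decode: {t2 - t1:.4f} s -> array {img.shape} {img.dtype}")
```

If the decode step dominates, a 20x faster drive only shrinks the smaller of the two numbers, which would explain why the upgrade is hard to notice.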


The script is probably written in Python, which is limited by the GIL: https://realpython.com/python-gil/

So the speed will not improve significantly after upgrading the computer unless you also rework the script.
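If the loading loop is single-threaded Python, one common way to put the extra cores to work without fighting the GIL is to decode the PNGs in separate processes. This is a generic sketch (not the module from this thread), assuming Pillow and NumPy and a hypothetical frames/ folder, and meant to be run as a standalone script rather than inside Slicer’s Python console:

```python
import glob
from concurrent.futures import ProcessPoolExecutor

import numpy as np
from PIL import Image  # Pillow, assumed to be installed


def load_slice(path):
    # Each worker process has its own interpreter, so PNG decoding
    # is not serialized by a single GIL.
    return np.asarray(Image.open(path))


if __name__ == "__main__":
    paths = sorted(glob.glob("frames/*.png"))  # placeholder folder
    with ProcessPoolExecutor() as pool:
        slices = list(pool.map(load_slice, paths, chunksize=8))
    volume = np.stack(slices)  # stack the 2D slices into one array
    print(volume.shape, volume.dtype)
```

Whether this helps depends on where the time actually goes; if the script spends most of its time inside Slicer/VTK calls rather than in pure Python, the GIL is less of a factor.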

Because reading and writing thousands of files individually has a lot of overhead. See this incredible post; I became interested because yesterday they gave me over 60 GB of PNG sequence.

With 60 GB of RAM and 1000 GB of NVMe available for virtual memory and for storing the images, it worked fine and everything is okay.
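If the per-file overhead is the real problem, one workaround (not something described in this thread) is to convert the PNG stack once into a single volume file and load that from then on, for example with SimpleITK, which ships with Slicer. A minimal sketch; frames/ and stack.nrrd are placeholder names:

```python
import glob

import SimpleITK as sitk

# Read the whole PNG series once...
reader = sitk.ImageSeriesReader()
reader.SetFileNames(sorted(glob.glob("frames/*.png")))  # placeholder folder
volume = reader.Execute()

# ...and write it out as one volume file (True enables compression),
# so later loads open a single file instead of thousands of small ones.
sitk.WriteImage(volume, "stack.nrrd", True)
```

Loading the resulting stack.nrrd afterwards should be one file open and one decode pass instead of thousands of separate ones.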