TotalSegmentator error at first run: ‘DummyFile’ object has no attribute ‘flush’

Hi @lassoan, I’m trying to use the TotalSegmentator tool, but I receive this error below the Apply button:

RuntimeError: FIND was unable to find an engine to execute this computation

Exception ignored in: <totalsegmentator.libs.DummyFile object at 0x000002765B9EE4F0>

AttributeError: 'DummyFile' object has no attribute 'flush'

I checked whether it could be an error due to the GPU’s memory, but my laptop has 10 GB of memory.
Thanks for your time
AZ

Could you provide some more details? What is the brand and model of the GPU?

Sure. The model is “Intel(R) UHD Graphics 620”.

Unfortunately, GPU processing requires an NVIDIA GPU, so you will need to switch to CPU processing, which takes longer to run (about 30 minutes vs. 2 minutes).

Switch here:


I don’t know why, but it doesn’t switch the computation to the CPU; it still remains on the GPU and asks me if I want to enable the ‘fast’ mode.

You need to uninstall the GPU version of pytorch, restart Slicer, and install the CPU version of pytorch, as described here.
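A minimal sketch of what that reinstall looks like, assuming Slicer’s Python console and pytorch’s official CPU wheel index. `slicer.util.pip_install()` / `slicer.util.pip_uninstall()` are Slicer’s pip wrappers; the helper below only builds the pip argument string and is a hypothetical name for illustration.

```python
# Official pytorch CPU-only wheel index (assumption: current pytorch install docs).
CPU_WHEEL_INDEX = "https://download.pytorch.org/whl/cpu"

def cpu_install_args(package="torch"):
    """Build the pip argument string selecting the CPU-only build of a package."""
    return f"{package} --index-url {CPU_WHEEL_INDEX}"

# In Slicer's Python console you would then run:
#   slicer.util.pip_uninstall("torch")
#   slicer.util.pip_install(cpu_install_args())
# and restart Slicer before using TotalSegmentator again.
print(cpu_install_args())
```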


Ok, it is now working correctly. Thank you so much for your time.


Hey @lassoan, I have uninstalled the GPU version of pytorch, restarted Slicer, and installed the CPU version of pytorch. However, I am still encountering the same error.

What is the installed pytorch version?
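One quick, hedged way to check without importing torch itself: ask pip’s metadata for the installed version string. A “+cpu” suffix indicates the CPU-only build, while “+cu118” and similar indicate a CUDA build; the helper name below is an illustration, not part of any library.

```python
# importlib.metadata is stdlib (Python 3.8+); torch itself is never imported here.
from importlib import metadata

def torch_build(version_string):
    """Classify a pip version string as 'cpu', 'cuda', or 'unknown'."""
    if "+cpu" in version_string:
        return "cpu"
    if "+cu" in version_string:
        return "cuda"
    return "unknown"

try:
    version = metadata.version("torch")
    print(version, "->", torch_build(version))
except metadata.PackageNotFoundError:
    print("torch is not installed")
```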

I found your other post here.

It’s been resolved. Thank you !


I’m having the same problem. Can somebody analyze it for me?

The "DummyFile" object does not have a "flush" attribute
returned non-zero exit status 120.

I cannot find any information about the "DummyFile" object not having a "flush" attribute.

The pytorch version details:

The error details:

Writing input file to C:/Users/26686/AppData/Local/Temp/Slicer/__SlicerTemp__2024-06-21_17+33+47.251/total-segmentator-input.nii
Creating segmentations with TotalSegmentator AI (pre-run)...
Total Segmentator arguments: ['-i', 'C:/Users/26686/AppData/Local/Temp/Slicer/__SlicerTemp__2024-06-21_17+33+47.251/total-segmentator-input.nii', '-o', 'C:/Users/26686/AppData/Local/Temp/Slicer/__SlicerTemp__2024-06-21_17+33+47.251/segmentation', '--fast']
D:\Slicer 5.6.2\lib\Python\Lib\site-packages\requests\__init__.py:102: RequestsDependencyWarning: urllib3 (1.26.18) or chardet (5.2.0)/charset_normalizer (2.0.12) doesn't match a supported version!
  warnings.warn("urllib3 ({}) or chardet ({})/charset_normalizer ({}) doesn't match a supported "
multiprocessing.pool.RemoteTraceback:
"""
Traceback (most recent call last):
  File "D:\Slicer 5.6.2\lib\Python\Lib\multiprocessing\pool.py", line 125, in worker
    result = (True, func(*args, **kwds))
  File "D:\Slicer 5.6.2\lib\Python\Lib\multiprocessing\pool.py", line 51, in starmapstar
    return list(itertools.starmap(args[0], args[1]))
  File "D:\Slicer 5.6.2\lib\Python\Lib\site-packages\nnunetv2\inference\export_prediction.py", line 39, in export_prediction_from_softmax
    segmentation = label_manager.convert_logits_to_segmentation(predicted_array_or_file)
  File "D:\Slicer 5.6.2\lib\Python\Lib\site-packages\nnunetv2\utilities\label_handling\label_handling.py", line 182, in convert_logits_to_segmentation
    return self.convert_probabilities_to_segmentation(probabilities)
  File "D:\Slicer 5.6.2\lib\Python\Lib\site-packages\nnunetv2\utilities\label_handling\label_handling.py", line 175, in convert_probabilities_to_segmentation
    segmentation = predicted_probabilities.argmax(0)
numpy.core._exceptions._ArrayMemoryError: Unable to allocate 6.85 GiB for an array with shape (287, 233, 233, 118) and data type float32
"""

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "D:\Slicer 5.6.2\lib\Python\Lib\runpy.py", line 197, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "D:\Slicer 5.6.2\lib\Python\Lib\runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "D:\Slicer 5.6.2\lib\Python\Scripts\TotalSegmentator.exe\__main__.py", line 7, in <module>
  File "D:\Slicer 5.6.2\lib\Python\Lib\site-packages\totalsegmentator\bin\TotalSegmentator.py", line 127, in main
    totalsegmentator(args.input, args.output, args.ml, args.nr_thr_resamp, args.nr_thr_saving,
  File "D:\Slicer 5.6.2\lib\Python\Lib\site-packages\totalsegmentator\python_api.py", line 293, in totalsegmentator
    seg_img, ct_img = nnUNet_predict_image(input, output, task_id, model=model, folds=folds,
  File "D:\Slicer 5.6.2\lib\Python\Lib\site-packages\totalsegmentator\nnunet.py", line 395, in nnUNet_predict_image
    nnUNetv2_predict(tmp_dir, tmp_dir, task_id, model, folds, trainer, tta,
  File "D:\Slicer 5.6.2\lib\Python\Lib\site-packages\totalsegmentator\nnunet.py", line 178, in nnUNetv2_predict
    predict_from_raw_data(dir_in,
  File "D:\Slicer 5.6.2\lib\Python\Lib\site-packages\nnunetv2\inference\predict_from_raw_data.py", line 347, in predict_from_raw_data
    [i.get() for i in r]
  File "D:\Slicer 5.6.2\lib\Python\Lib\site-packages\nnunetv2\inference\predict_from_raw_data.py", line 347, in <listcomp>
    [i.get() for i in r]
  File "D:\Slicer 5.6.2\lib\Python\Lib\multiprocessing\pool.py", line 771, in get
    raise self._value
numpy.core._exceptions._ArrayMemoryError: Unable to allocate 6.85 GiB for an array with shape (287, 233, 233, 118) and data type float32
Exception ignored in: <totalsegmentator.libs.DummyFile object at 0x000001E2BB5F6B20>
AttributeError: 'DummyFile' object has no attribute 'flush'

If you use this tool please cite: https://pubs.rsna.org/doi/10.1148/ryai.230024

Using 'fast' option: resampling to lower resolution (3mm)
Resampling...
  Resampled in 4.25s
Predicting...

The "AttributeError: 'DummyFile' object has no attribute 'flush'" message is a false alarm; you can ignore it.

In general, you need to look for the first error in the output - all further errors may be consequences of the first one. In your case, the first error was that you ran out of memory:

To solve the issue, you need to reduce the image size or increase the available memory. See detailed instructions here.
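For reference, the 6.85 GiB figure in the traceback follows directly from the array shape it reports: a float32 element takes 4 bytes, so the softmax array of shape (287, 233, 233, 118) needs the product of those dimensions times 4 bytes.

```python
# Sanity-check the allocation size reported in the traceback:
# shape (287, 233, 233, 118), dtype float32 (4 bytes per element).
import math

shape = (287, 233, 233, 118)
n_bytes = math.prod(shape) * 4          # total bytes for a float32 array
print(round(n_bytes / 1024**3, 2))      # size in GiB -> 6.85
```

This is why the ‘fast’ (3 mm) option or cropping the input volume helps: fewer voxels shrink every one of these intermediate arrays.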

Many thanks for taking the time to respond. I will try setting up larger virtual memory and running it again.