I’m having the same problem. Can somebody analyze this for me?
The two errors are:

'DummyFile' object has no attribute 'flush'
returned non-zero exit status 120.

I cannot find any information about the 'DummyFile' flush error.
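From what I can tell, the "Exception ignored" message at the end of the log below is what Python prints at interpreter shutdown when sys.stdout has been replaced by a file-like object that implements write() but not flush(). Here is a minimal sketch that reproduces the same AttributeError, assuming TotalSegmentator's DummyFile works roughly like this (the class body is my guess, not the library's actual code):

```python
import sys

class DummyFile:
    # hypothetical stand-in used to silence output: write() only, no flush()
    def write(self, text):
        pass

sys.stdout = DummyFile()
print("this line is swallowed")  # fine: print() only needs write()
sys.stdout.flush()               # AttributeError: 'DummyFile' object has no attribute 'flush'
```

If that is what is happening, the flush error is only shutdown noise, and the real failure should be elsewhere in the log.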
My PyTorch version details:

Error details:
Writing input file to C:/Users/26686/AppData/Local/Temp/Slicer/__SlicerTemp__2024-06-21_17+33+47.251/total-segmentator-input.nii
Creating segmentations with TotalSegmentator AI (pre-run)...
Total Segmentator arguments: ['-i', 'C:/Users/26686/AppData/Local/Temp/Slicer/__SlicerTemp__2024-06-21_17+33+47.251/total-segmentator-input.nii', '-o', 'C:/Users/26686/AppData/Local/Temp/Slicer/__SlicerTemp__2024-06-21_17+33+47.251/segmentation', '--fast']
D:\Slicer 5.6.2\lib\Python\Lib\site-packages\requests\__init__.py:102: RequestsDependencyWarning: urllib3 (1.26.18) or chardet (5.2.0)/charset_normalizer (2.0.12) doesn't match a supported version!
warnings.warn("urllib3 ({}) or chardet ({})/charset_normalizer ({}) doesn't match a supported "
multiprocessing.pool.RemoteTraceback:
"""
Traceback (most recent call last):
  File "D:\Slicer 5.6.2\lib\Python\Lib\multiprocessing\pool.py", line 125, in worker
    result = (True, func(*args, **kwds))
  File "D:\Slicer 5.6.2\lib\Python\Lib\multiprocessing\pool.py", line 51, in starmapstar
    return list(itertools.starmap(args[0], args[1]))
  File "D:\Slicer 5.6.2\lib\Python\Lib\site-packages\nnunetv2\inference\export_prediction.py", line 39, in export_prediction_from_softmax
    segmentation = label_manager.convert_logits_to_segmentation(predicted_array_or_file)
  File "D:\Slicer 5.6.2\lib\Python\Lib\site-packages\nnunetv2\utilities\label_handling\label_handling.py", line 182, in convert_logits_to_segmentation
    return self.convert_probabilities_to_segmentation(probabilities)
  File "D:\Slicer 5.6.2\lib\Python\Lib\site-packages\nnunetv2\utilities\label_handling\label_handling.py", line 175, in convert_probabilities_to_segmentation
    segmentation = predicted_probabilities.argmax(0)
numpy.core._exceptions._ArrayMemoryError: Unable to allocate 6.85 GiB for an array with shape (287, 233, 233, 118) and data type float32
"""

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "D:\Slicer 5.6.2\lib\Python\Lib\runpy.py", line 197, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "D:\Slicer 5.6.2\lib\Python\Lib\runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "D:\Slicer 5.6.2\lib\Python\Scripts\TotalSegmentator.exe\__main__.py", line 7, in <module>
  File "D:\Slicer 5.6.2\lib\Python\Lib\site-packages\totalsegmentator\bin\TotalSegmentator.py", line 127, in main
    totalsegmentator(args.input, args.output, args.ml, args.nr_thr_resamp, args.nr_thr_saving,
  File "D:\Slicer 5.6.2\lib\Python\Lib\site-packages\totalsegmentator\python_api.py", line 293, in totalsegmentator
    seg_img, ct_img = nnUNet_predict_image(input, output, task_id, model=model, folds=folds,
  File "D:\Slicer 5.6.2\lib\Python\Lib\site-packages\totalsegmentator\nnunet.py", line 395, in nnUNet_predict_image
    nnUNetv2_predict(tmp_dir, tmp_dir, task_id, model, folds, trainer, tta,
  File "D:\Slicer 5.6.2\lib\Python\Lib\site-packages\totalsegmentator\nnunet.py", line 178, in nnUNetv2_predict
    predict_from_raw_data(dir_in,
  File "D:\Slicer 5.6.2\lib\Python\Lib\site-packages\nnunetv2\inference\predict_from_raw_data.py", line 347, in predict_from_raw_data
    [i.get() for i in r]
  File "D:\Slicer 5.6.2\lib\Python\Lib\site-packages\nnunetv2\inference\predict_from_raw_data.py", line 347, in <listcomp>
    [i.get() for i in r]
  File "D:\Slicer 5.6.2\lib\Python\Lib\multiprocessing\pool.py", line 771, in get
    raise self._value
numpy.core._exceptions._ArrayMemoryError: Unable to allocate 6.85 GiB for an array with shape (287, 233, 233, 118) and data type float32
Exception ignored in: <totalsegmentator.libs.DummyFile object at 0x000001E2BB5F6B20>
AttributeError: 'DummyFile' object has no attribute 'flush'
If you use this tool please cite: https://pubs.rsna.org/doi/10.1148/ryai.230024
Using 'fast' option: resampling to lower resolution (3mm)
Resampling...
Resampled in 4.25s
Predicting...
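The 6.85 GiB in the _ArrayMemoryError matches the array shape in the traceback: a float32 array of shape (287, 233, 233, 118) takes 287 × 233 × 233 × 118 × 4 bytes ≈ 6.85 GiB. A quick sanity check:

```python
import numpy as np

shape = (287, 233, 233, 118)  # shape reported in the traceback
nbytes = int(np.prod(shape)) * np.dtype(np.float32).itemsize
print(f"{nbytes / 2**30:.2f} GiB")  # prints 6.85
```

So it looks like the process ran out of RAM while nnU-Net was converting the softmax probabilities to a segmentation, and the exit status 120 and the flush message follow from that crash.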