MONAI Label Server error

Dear all 3D Slicer members,

I'm working on automatic segmentation with the brats_mri_segmentation_v0.2.1 bundle in 3D Slicer.

To start the server, I use this command:

monailabel start_server --app apps/monaibundle --studies datasets/Task01_BrainTumour/imagesTr --conf models brats_mri_segmentation_v0.2.1

I added "ensure_channel_first": true to the "preprocessing" section of the monaibundle's inference.json.
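For reference, the edit looks roughly like this. This is a sketch of the bundle's preprocessing chain, not the exact file contents; the surrounding transforms (Compose, NormalizeIntensityd) are assumptions about what the brats bundle's inference.json contains, so adapt it to your actual file:

```json
{
  "preprocessing": {
    "_target_": "Compose",
    "transforms": [
      {
        "_target_": "LoadImaged",
        "keys": "image",
        "ensure_channel_first": true
      },
      {
        "_target_": "NormalizeIntensityd",
        "keys": "image",
        "nonzero": true,
        "channel_wise": true
      }
    ]
  }
}
```

Here ensure_channel_first is passed as an argument to the LoadImaged transform so the loaded image gets a leading channel dimension before the later transforms run.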

But it raises the error 'Failed to run Inference in MONAI Label Server'. Is there a solution for this?

train.json also needs to be edited, but I don't know where to add the code.
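If train.json follows the same bundle layout, the analogous change would presumably go on the LoadImaged transform inside the training preprocessing section. The fragment below is a guess at that structure (section names and keys are assumptions, not copied from the actual file), and the same edit would likely be needed in the validation preprocessing as well:

```json
{
  "train": {
    "preprocessing": {
      "_target_": "Compose",
      "transforms": [
        {
          "_target_": "LoadImaged",
          "keys": ["image", "label"],
          "ensure_channel_first": true
        }
      ]
    }
  }
}
```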

Please let me know how to solve the 'Failed to run Inference in MONAI Label Server' error.

Output of monai.config.print_debug_info():

Printing MONAI config...
MONAI version: 1.0.1
Numpy version: 1.23.4
Pytorch version: 1.12.1+cu113
MONAI flags: HAS_EXT = False, USE_COMPILED = False, USE_META_DICT = False
MONAI rev id: 8271a193229fe4437026185e218d5b06f7c8ce69
MONAI file: C:\Users\AA\AppData\Local\Programs\Python\Python39\lib\site-packages\

Optional dependencies:
Pytorch Ignite version: 0.4.10
Nibabel version: 4.0.2
scikit-image version: 0.19.3
Pillow version: 9.3.0
Tensorboard version: 2.10.1
gdown version: 4.5.3
TorchVision version: 0.13.1+cu113
tqdm version: 4.64.1
lmdb version: 1.3.0
psutil version: 5.9.4
pandas version: 1.5.1
einops version: 0.6.0
transformers version: NOT INSTALLED or UNKNOWN VERSION.
pynrrd version: 0.4.3

For details about installing the optional dependencies, please visit:
Installation Guide — MONAI 0 Documentation

Printing system config...

System: Windows
Win32 version: ('10', '10.0.22000', 'SP0', 'Multiprocessor Free')
Win32 edition: Core
Platform: Windows-10-10.0.22000-SP0
Processor: AMD64 Family 25 Model 80 Stepping 0, AuthenticAMD
Machine: AMD64
Python version: 3.9.13
Process name: python.exe
Command: ['C:\Users\AA\AppData\Local\Programs\Python\Python39\python.exe', '-c', 'import monai; monai.config.print_debug_info()']
Open files: [popenfile(path='C:\Program Files\WindowsApps\Microsoft.LanguageExperiencePackko-KR_22000.29.134.0_neutral__8wekyb3d8bbwe\Windows\System32\ko-KR\a7c1941e6709c10ab525083b61805316\KernelBase.dll.mui', fd=-1), popenfile(path='C:\Program Files\WindowsApps\Microsoft.LanguageExperiencePackko-KR_22000.29.134.0_neutral__8wekyb3d8bbwe\Windows\System32\ko-KR\39386f74d1967f5c37a5b4171f81c8f3\kernel32.dll.mui', fd=-1), popenfile(path='C:\Program Files\WindowsApps\Microsoft.LanguageExperiencePackko-KR_22000.29.134.0_neutral__8wekyb3d8bbwe\Windows\System32\ko-KR\fe441ef3ed396a241e46f9f354057863\tzres.dll.mui', fd=-1)]
Num physical CPUs: 8
Num logical CPUs: 16
Num usable CPUs: 16
CPU usage (%): [10.8, 3.2, 13.1, 3.8, 7.6, 3.9, 4.5, 0.6, 3.9, 0.0, 1.9, 1.9, 8.3, 8.4, 4.5, 45.3]
CPU freq. (MHz): 2652
Load avg. in last 1, 5, 15 mins (%): [0.0, 0.0, 0.0]
Disk usage (%): 57.4
Avg. sensor temp. (Celsius): UNKNOWN for given OS
Total physical memory (GB): 15.4
Available memory (GB): 7.2
Used memory (GB): 8.2

Printing GPU config...

Num GPUs: 1
Has CUDA: True
CUDA version: 11.3
cuDNN enabled: True
cuDNN version: 8302
Current device: 0
Library compiled for CUDA architectures: ['sm_37', 'sm_50', 'sm_60', 'sm_61', 'sm_70', 'sm_75', 'sm_80', 'sm_86', 'compute_37']
GPU 0 Name: NVIDIA GeForce RTX 3070 Laptop GPU
GPU 0 Is integrated: False
GPU 0 Is multi GPU board: False
GPU 0 Multi processor count: 40
GPU 0 Total memory (GB): 8.0
GPU 0 CUDA capability (maj.min): 8.6

Update: I managed to solve the 'Failed to run Inference in MONAI Label Server' error.
