Loading NIfTI images into Slicer in real time

I am new to Slicer and am just getting familiar with this powerful software. A big thanks to all who developed this and continue to support it.

I am working on a problem where I need to visualize images in Slicer in "real time", i.e., as they are coming in from a scanner. The images are being dropped into a local folder as NIfTI files. I need to load the newest image into Slicer automatically as it comes in. Is there a way to do this?

Thanks in advance.

You would normally use the OpenIGTLink protocol for real-time image streaming from scanners. What software do you use for image acquisition, and on what hardware?

Reading uncompressed NIfTI files can be quite fast, too, but polling the file system for changes might be difficult at high frame rates.

OpenIGTLink would be a good option, but depending on your OS, a file system watcher can also be efficient.
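At the modest update rates mentioned here, even simple polling can work. A minimal sketch of the detection logic in plain Python (the `find_new_files` helper and the `.nii`/`.nii.gz` suffixes are my own choices for illustration, not part of any Slicer API); inside Slicer you would call it periodically from a `qt.QTimer` and pass each new path to `slicer.util.loadVolume`:

```python
import os

def find_new_files(folder, seen, suffixes=(".nii", ".nii.gz")):
    """Return paths in `folder` with a matching suffix that are not yet in
    `seen`, adding them to `seen` so each file is reported only once.
    Real code should also verify the scanner has finished writing the file
    (e.g., by checking that its size is stable between two polls)."""
    new_paths = []
    for entry in sorted(os.scandir(folder), key=lambda e: e.name):
        if entry.is_file() and entry.name.endswith(suffixes) and entry.path not in seen:
            seen.add(entry.path)
            new_paths.append(entry.path)
    return new_paths

# Inside Slicer's Python console, poll once a second and load new volumes:
#
#   seen = set()
#   timer = qt.QTimer()
#   timer.timeout.connect(lambda: [slicer.util.loadVolume(p)
#                                  for p in find_new_files("/path/to/incoming", seen)])
#   timer.start(1000)
```

Note that `slicer.util.loadVolume` runs in the main thread, which is one of the drawbacks of the file-based approach discussed below.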


Another option is SlicerWeb, which provides a REST endpoint for POSTing an NRRD file directly into a volume node; that could be extended to handle NIfTI as well.


Thanks for your response, Andras. We are using a Philips MRI scanner with the Philips XTC protocol to get images in real time. Frame rates would be at most about one 3D volume per 5 seconds, or one 2D image per second.

Thanks for the suggestions, Steve. It'll definitely take me some time to explore these options 🙂

This is not a very demanding update rate, so using the file system for data transfer would most likely not lead to significant overhead.

However, OpenIGTLink was developed for exactly this application (MRI image acquisition and scan plane control on Siemens MRI scanners for robot-assisted interventions) and has several advantages over basic file-based communication, including:

  • Better performance: File reading would happen in the main thread, blocking the whole application for short periods of time, which can be quite annoying for users during continuous acquisition. In contrast, OpenIGTLink continuously reads data from the network in the background, without blocking the application GUI at all (and takes care of all the necessary synchronization between the main thread and background threads).
  • Two-way communication: In addition to images (either 2D or 3D), OpenIGTLink also allows you to control scan plane (by sending transforms), start/stop acquisition, switch between imaging modes (using commands), etc.
  • Simpler implementation: You can implement a Philips XTC/OpenIGTLink bridge in a few dozen lines of Python code, by adding a small OpenIGTLink frontend to matMRI using pyigtl. You don’t need to worry about multithreading, synchronization, debugging random delays in file systems, etc.
  • Large ecosystem: There are many tools for real-time image-guided interventional applications based on OpenIGTLink (see for example SlicerIGT and the PLUS toolkit). You can record, replay, mix, calibrate, synchronize, simulate, and broadcast OpenIGTLink data streams. Many imaging and position tracking tools, various sensors, robotic positioning devices, etc. use OpenIGTLink, so it is easy to integrate all these devices into a system and easy to replace any component with another (e.g., in the lab you can simulate some components just by changing configuration files, without changing any code in your application). You can easily distribute the processing work across several computers if needed, and we have examples of how to apply real-time AI processing to image streams.
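For context on what pyigtl handles for you: every OpenIGTLink message starts with a fixed 58-byte big-endian header, followed by the message body. A sketch of packing a version-1 header with only the standard library (the device name and body here are made up for illustration, and the CRC field is left at zero rather than the real CRC-64 a compliant sender must compute over the body):

```python
import struct

def pack_igtl_header(msg_type, device_name, body, timestamp=0, crc=0):
    """Pack an OpenIGTLink v1 message header:
    uint16 version, char[12] type, char[20] device name,
    uint64 timestamp, uint64 body size, uint64 CRC-64.
    All fields are big-endian; the header is always 58 bytes."""
    return struct.pack(
        ">H12s20sQQQ",
        1,                            # protocol version
        msg_type.encode("ascii"),     # e.g. "IMAGE", "TRANSFORM"
        device_name.encode("ascii"),  # how the node appears in Slicer
        timestamp,
        len(body),                    # body size in bytes
        crc,                          # CRC-64 of the body (placeholder 0 here)
    )

# Hypothetical device name and dummy body, just to show the framing:
header = pack_igtl_header("IMAGE", "PhilipsXTC", b"\x00" * 16)
```

With pyigtl you never touch this level: if I remember its API correctly, you create a `pyigtl.OpenIGTLinkServer`, wrap each incoming volume in a `pyigtl.ImageMessage`, and send it; Slicer's OpenIGTLinkIF module then connects as a client and updates the volume node as frames arrive.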

Thank you very much Andras (and I apologize for my late response).
Yes, we are using matMRI (thanks, Sam Pichardo), so as you suggest, integrating OpenIGTLink with that would be the most powerful solution.
