Using Plus Server to allow live streaming in 3D Slicer

We’re trying to get live streaming in 3D Slicer using Plus Server, but we’re having a bit of trouble. We tried to follow the instructions under “Show live images in 3D Slicer” at http://perk-software.cs.queensu.ca/plus/doc/nightly/user/ProcedureStreamingToSlicer.html, but were not able to get live streaming. Essentially, we launch Slicer on its own, load the Image Reference mentioned in the link above, and then follow the steps.

We’re currently using the Plus Server Launcher, and the instructions refer to an Image Reference; we used the Image_Image.nrrd file. Are we missing a step or doing something incorrectly?

That document is a bit out of date.
In Slicer 4.10+, you need to download the SlicerOpenIGTLink extension to get access to OpenIGTLinkIF.

What config file are you using in Plus?

We are using the OpenIGTLinkIF module to set up our connection; however, the steps we followed were based on the link above.

The config file we’re using within Plus Server is the following XML: PlusDeviceSet_Server_OpticalMarkerTracker_Mmf. Should we be using another one?

With that config file, you need to create two connectors:

  1. Port 18944 – transmits the transforms for the optical markers
  2. Port 18945 – transmits the webcam stream as an image called “Image_Image”
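For reference, the two connectors correspond to two `PlusOpenIGTLinkServer` elements in the config file. The sketch below is based on the shipped OpticalMarkerTracker example, but the channel and transform names (`TrackerStream`, `VideoStream`, `MarkerToTracker`) are placeholders here; check your local copy of the file for the exact names:

```xml
<!-- Sketch of the server section of a Plus config file (names are illustrative). -->
<PlusOpenIGTLinkServer ListeningPort="18944" OutputChannelId="TrackerStream">
  <DefaultClientInfo>
    <MessageTypes>
      <Message Type="TRANSFORM" />
    </MessageTypes>
    <TransformNames>
      <Transform Name="MarkerToTracker" />
    </TransformNames>
  </DefaultClientInfo>
</PlusOpenIGTLinkServer>
<PlusOpenIGTLinkServer ListeningPort="18945" OutputChannelId="VideoStream">
  <DefaultClientInfo>
    <MessageTypes>
      <Message Type="IMAGE" />
    </MessageTypes>
    <ImageNames>
      <Image Name="Image" EmbeddedTransformToFrame="Image" />
    </ImageNames>
  </DefaultClientInfo>
</PlusOpenIGTLinkServer>
```

The node shows up in Slicer as “Image_Image” because OpenIGTLinkIF combines the image name with the embedded frame name.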

If you encounter other issues with Plus, could you submit an issue on the Plus GitHub page and attach the Plus log file?

Hi, Sunderlandkyl!
I have been trying the “PlusServer: Video for Windows video capture device” function. When I connect the US device, the image in 3D Slicer appears upside-down compared with the display on the US device itself. To present the US imaging data correctly I tried the Transforms module in Slicer, but it didn’t work :smiling_face_with_tear:
Could you offer me some solutions?

You can update the “PortUsImageOrientation” attribute in your config file to quickly flip the image.
See the possible orientations here: Plus applications user manual: Ultrasound image orientation
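As a sketch of where that attribute lives (the device `Type`, `Id`, and channel names below are placeholders, not taken from your config), the orientation is set on the video `DataSource`:

```xml
<!-- Hypothetical video capture device entry; adjust Type/Id to match your config. -->
<Device Id="VideoDevice" Type="MmfVideo">
  <DataSources>
    <!-- If the image is upside-down with "MF", try "MN": swapping the second
         letter (F <-> N) flips the vertical axis, while swapping the first
         letter (M <-> U) flips the horizontal axis. -->
    <DataSource Type="Video" Id="Video" PortUsImageOrientation="MN" />
  </DataSources>
  <OutputChannels>
    <OutputChannel Id="VideoStream" VideoDataSourceId="Video" />
  </OutputChannels>
</Device>
```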

Thanks, Kyle! And sorry for the late reply.
Actually, I only have the Cannon US device (which is not in the list of supported devices), so I treated the US probe as a generic video capture source to test the “PlusServer: Video for Windows video capture device” function. The images still showed upside-down.
I also tried the method you suggested, in fCal, but it didn’t work. However, I found that the Volume Reslice Driver can flip the image, although that may not reflect the actual orientation (I guess?).

Sorry, one more question. The most attractive points of the PLUS toolkit are its volume reconstruction function and its flexible support for different devices. We have the Cannon US device and a robotic arm, neither of which is in the list of supported devices. I was wondering whether it is possible to use the PLUS toolkit for real-time volume reconstruction with the assistance of the robotic arm. Thank you for your time and consideration again!

Are you tracking the ultrasound or sending the transforms from the robotic arm to Plus or Slicer? If the ultrasound is tracked in some way, then you should be able to do live volume reconstruction with the current setup.
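For context, Plus exposes live volume reconstruction as a virtual device that consumes a tracked video channel. A rough config sketch follows; the channel `Id`, coordinate frame names, and clip rectangle values are placeholders you would need to adapt (the robot’s poses would have to reach Plus as a tracker stream, e.g. via OpenIGTLink):

```xml
<!-- Sketch: virtual volume reconstructor chained after a tracked US stream. -->
<Device Id="VolumeReconstructorDevice" Type="VirtualVolumeReconstructor">
  <InputChannels>
    <!-- Channel that carries the US frames with their tracking transforms. -->
    <InputChannel Id="TrackedVideoStream" />
  </InputChannels>
  <VolumeReconstruction
    ImageCoordinateFrame="Image"
    ReferenceCoordinateFrame="Reference"
    Interpolation="LINEAR"
    CompoundingMode="MEAN"
    ClipRectangleOrigin="0 0"
    ClipRectangleSize="640 480" />
</Device>
```

The key requirement is a valid ImageToReference transform chain, which is why the probe needs to be tracked (here, by the robot arm).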

Sorry for the late reply. Actually, we rigidly attached the US probe to the robotic arm, which means the spatial data of the US probe can be obtained from the robotic arm.
I will try to follow that setup to do the live volume reconstruction. Thank you so much~~~