Unity tracking bridge client and ultrasound server... help needed

Hi all,

I am pretty new to Slicer; I actually stumbled over it only yesterday, and I have a problem that is either easy to solve or simply impossible :slight_smile:

I have some experience with spatial tracking in Unity using all kinds of hardware and really like it. Recently I came across a Telemed ultrasound device, and the idea was born to track an object in the Unity coordinate frame and combine it with the tracked ultrasound surface reconstruction that is already well implemented in Slicer.
So far I have managed to get the outdated Unity OpenIGTLink scripts working, and with a server started in Slicer I get the transforms and rotations of my trackables displayed in the 3D view as expected.
On the other hand, if I start the Plus server and switch Slicer to client mode, I get the ultrasound images from the Telemed device displayed in Slicer.
Unfortunately I am no networking specialist and not yet experienced enough with Slicer to find a simple way to combine both features. Because the Unity script only connects to a server and sends some transforms back, I have a client/server paradox: I would need Slicer to act as a server for the trackables and as a client for the US images simultaneously.
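
To make it concrete, this is roughly what I imagine ending up with in the Slicer Python console: two OpenIGTLink connector nodes in the same scene, one in server mode for Unity and one in client mode for Plus. This is just an untested sketch, under the assumption that the OpenIGTLinkIF connector node can be created and configured from Python like this; the port numbers are placeholders for my setup.

```python
# Untested sketch: run in the Slicer Python console.
# Two OpenIGTLink connector nodes in one scene: one acting as a server
# (for the Unity trackables) and one as a client (for the Plus US stream).

# Server connector: the Unity OpenIGTLink script connects to this one
# and sends its transforms.
unityConnector = slicer.mrmlScene.AddNewNodeByClass("vtkMRMLIGTLConnectorNode")
unityConnector.SetName("UnityServer")
unityConnector.SetTypeServer(18945)   # placeholder port for the Unity side
unityConnector.Start()

# Client connector: Slicer connects to the Plus server for the Telemed images.
plusConnector = slicer.mrmlScene.AddNewNodeByClass("vtkMRMLIGTLConnectorNode")
plusConnector.SetName("PlusClient")
plusConnector.SetTypeClient("localhost", 18944)  # default Plus OpenIGTLink port
plusConnector.Start()
```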

Maybe it only takes some XML magic in the Plus server config so that the server catches the transforms sent from Unity and passes them on to a Slicer client together with the US image… but so far I could not manage it.
At least Unity does not crash when the Plus server is launched, so they are communicating.
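
Just to illustrate what I mean by XML magic, something along these lines is what I had in mind. This is an untested sketch: the device types and attribute names are my best guesses from the Plus documentation, the tool/transform names are placeholders, and it assumes the Unity side could be made to run an OpenIGTLink server that Plus connects to as a client (which the current scripts do not do). A real config would presumably also need an ImageToProbe calibration in a CoordinateDefinitions element.

```xml
<PlusConfiguration version="2.1">
  <DataCollection StartupDelaySec="1.0">
    <DeviceSet Name="Telemed US + Unity transforms (sketch)"
               Description="Hypothetical combined configuration"/>

    <!-- Telemed ultrasound video source -->
    <Device Id="VideoDevice" Type="TelemedVideo">
      <DataSources>
        <DataSource Type="Video" Id="Video" PortUsImageOrientation="UF"/>
      </DataSources>
      <OutputChannels>
        <OutputChannel Id="VideoStream" VideoDataSourceId="Video"/>
      </OutputChannels>
    </Device>

    <!-- Transforms received over OpenIGTLink. Assumes the Unity side runs an
         OpenIGTLink SERVER on port 18946 that Plus connects to, and that the
         tool naming matches what Unity sends. -->
    <Device Id="TrackerDevice" Type="OpenIGTLinkTracker"
            ServerAddress="127.0.0.1" ServerPort="18946"
            MessageType="TRANSFORM" ToolReferenceFrame="Tracker"
            ReconnectOnReceiveTimeout="true">
      <DataSources>
        <DataSource Type="Tool" Id="Probe"/>
      </DataSources>
      <OutputChannels>
        <OutputChannel Id="TrackerStream"/>
      </OutputChannels>
    </Device>

    <!-- Merge images and transforms into one tracked video stream -->
    <Device Id="TrackedVideoDevice" Type="VirtualMixer">
      <InputChannels>
        <InputChannel Id="VideoStream"/>
        <InputChannel Id="TrackerStream"/>
      </InputChannels>
      <OutputChannels>
        <OutputChannel Id="TrackedVideoStream"/>
      </OutputChannels>
    </Device>
  </DataCollection>

  <!-- Slicer connects here as an OpenIGTLink client (default port 18944) -->
  <PlusOpenIGTLinkServer ListeningPort="18944"
                         OutputChannelId="TrackedVideoStream"
                         SendValidTransformsOnly="true">
    <DefaultClientInfo>
      <MessageTypes>
        <Message Type="IMAGE"/>
        <Message Type="TRANSFORM"/>
      </MessageTypes>
      <TransformNames>
        <Transform Name="ProbeToTracker"/>
      </TransformNames>
      <ImageNames>
        <Image Name="Image" EmbeddedTransformToFrame="Probe"/>
      </ImageNames>
    </DefaultClientInfo>
  </PlusOpenIGTLinkServer>
</PlusConfiguration>
```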

Any help is much appreciated!

Best regards

Allan

Operating system: Windows 10
Slicer version: 5.0.3
Expected behavior: Unity-tracked transforms and the Plus ultrasound stream combined in one Slicer scene
Actual behavior: Each connection works on its own, but not both at the same time

Have you connected Unity with the 3D Slicer elements?

There is a more recent Unity/Slicer integration project by @AliciaPose:

It allows sending/receiving transforms and sending an image stream from Slicer to Unity.


That looks promising, I'll give it a try. I also thought about using the already available Slicer VR pipeline to attach the ultrasound images to a Vive tracker as a low-cost navigation solution, which should also make Slicer's ultrasound segmentation capability available. I've done some experiments in the past to get rid of the headset requirement in SteamVR, so maybe I will find the time to try both ideas. Thanks a lot for your response.