While working on some potential clinical use cases, we (with @lassoan) made a video about collaborative planning in VR, in which two Slicer instances using the same scene are connected through OpenIGTLink, so we can see each other in VR. We thought you'd find this interesting.
@pieper I like the new tag you created
This is great, guys! We've been trying to use your process in 3D Slicer for VR, but it's been a big learning curve. This helps.
It looks great! Is there any tutorial for this (about the linking)?
There is no tutorial, because the process is not polished at all (it was quite lengthy to set up), and this was just an experiment.
We're planning to develop a module in the near future that helps set up the Slicer instances for VR collaboration. We'll probably make a tutorial for the module once it's done.
Thanks for getting back.
One more question. I am working with a Windows Mixed Reality headset, and I am planning to invest in a new one. I read in a discussion somewhere that the HTC Vive is a good option. Is there anything specific the HTC Vive can do regarding the virtual display? Also, any suggestions on which HTC Vive model to use with Slicer?
The main difference of the HTC Vive compared to Windows MR headsets is that the HTC Vive uses fixed lighthouse devices, which make tracking initialization more robust (you don't need to look around after you put on the headset; tracking starts right away). So, if you have a dedicated place where you use your virtual reality system, then I would recommend the HTC Vive; if you want a portable system, then I would recommend any Windows MR headset.
If you use a laptop, then probably go with the classic HTC Vive (it is hard to find laptops that can drive an HTC Vive Pro); if you have a desktop system with one of the newest GPUs, then you can use an HTC Vive Pro (it has somewhat better specs than the basic HTC Vive).
Thanks for clarifying, Andras. I guess I have to stick with a Windows headset, as I don't have a dedicated place.
I think the setup is quite painless, especially if you just modify an existing scene that already has the controller transform sharing configured. The setup could also be automated with a short Python script.
The main idea is that you run two instances of Slicer, each connected to a headset. Both instances contain the same data nodes (models, volumes, etc.), and their parent transforms are set as outgoing nodes in the OpenIGTLinkIF module (which ensures the transform nodes are synchronized between the instances). The scene that we used in the video is available here.
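As a rough illustration of what such an automation script might look like, here is a minimal sketch to run in the Slicer Python console on each instance (one acting as server, the other as client). The function name, the transform node names, and the port are assumptions, not part of the original setup; the connector API calls are from the OpenIGTLinkIF module:

```python
# Sketch: configure one Slicer instance for VR collaboration over OpenIGTLink.
# Assumptions: the scene already contains the parent transform nodes to share,
# and the default OpenIGTLink port (18944) is used.

def setup_vr_collaboration(transform_node_names, is_server=True,
                           host="localhost", port=18944):
    """Create an OpenIGTLink connector and share the given transform nodes."""
    import slicer  # only available inside the Slicer application

    connector = slicer.mrmlScene.AddNewNodeByClass("vtkMRMLIGTLConnectorNode")
    if is_server:
        connector.SetTypeServer(port)
    else:
        connector.SetTypeClient(host, port)
    connector.Start()

    # Register each parent transform as an outgoing node so its updates
    # (e.g. VR controller poses) are pushed to the other Slicer instance.
    for name in transform_node_names:
        node = slicer.mrmlScene.GetFirstNodeByName(name)
        if node is not None:
            connector.RegisterOutgoingMRMLNode(node)
    return connector
```

On the second instance you would call the same function with `is_server=False` (pointing `host` at the first machine) and register that instance's own controller transforms as outgoing; transforms arriving from the peer are applied automatically once the connection is up.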
I haven't worked with the OpenIGTLink module, so I have to see if I can recreate the scene. I will post if I am able to achieve similar results.
Hi, I'm working with Unity. Is there any chance we could exchange developments so we can all get the best out of this? I need MRI scans from hospitals so I can try the experience. My email: firstname.lastname@example.org