I am trying to create slices using the orientation of the HTC Vive HMD/controller location for some custom visualization. Are the position and orientation of the HTC Vive hardware available through Python code using the SlicerIGT or SlicerVirtualReality extensions?
Positions of the controllers are made available as transforms in the Slicer scene. These transforms can be used in custom modules to reslice volumes (using the Volume Reslice Driver module in the SlicerIGT extension) or to transform any nodes in the scene.
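Conceptually, each controller transform is a homogeneous 4x4 matrix that maps controller-local coordinates into the scene's RAS coordinates; "transforming a node" just means pushing the node's points through that matrix. A minimal plain-Python sketch (the pose values are made up):

```python
# Made-up example pose: identity rotation plus a translation of
# (10, 20, 30) mm, as a homogeneous 4x4 matrix (row-major nested lists)
pose = [[1.0, 0.0, 0.0, 10.0],
        [0.0, 1.0, 0.0, 20.0],
        [0.0, 0.0, 1.0, 30.0],
        [0.0, 0.0, 0.0, 1.0]]

def apply_transform(m, point):
    """Map a 3D point through a homogeneous 4x4 matrix."""
    x, y, z = point
    return [m[r][0] * x + m[r][1] * y + m[r][2] * z + m[r][3] for r in range(3)]

# The controller tip at its local origin lands at the translation part
tip_ras = apply_transform(pose, (0.0, 0.0, 0.0))  # [10.0, 20.0, 30.0]
```

Inside Slicer you never need to do this by hand - parenting a node under the transform node (or calling `node.SetAndObserveTransformNodeID(transformNode.GetID())`) has the same effect.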
But I don’t seem to be able to find any way to access it.
Controller poses could be made available in the physical coordinate system, too. The best option depends on what you would use that information for and how, so if you need this then please provide more details about what you would like to do.
I am sorry, I do not have much experience working with C++ variables exposed through Python, so I basically wanted to understand how to go about viewing the controller pose in Python. After my query, I took the bit of code vrView = getNode('VirtualRealityView') on the page https://github.com/KitwareMedical/SlicerVirtualReality/blob/master/DeveloperGuide.md as my starting point and realized that I needed to fetch the node for one of the controllers (the left one in my case). I will post the process I followed, in case someone else is stuck like I was.
The first thing I tried was printing the vrView variable above. Its output contained a node for the left controller, and I used the same getNode syntax to fetch it: LCon = getNode('vtkMRMLLinearTransformNodeVirtualReality.LeftController')
Printing the above LCon variable gave me my first glimpse of the transformation matrix for the left controller. I was then stuck for a very long time on extracting that matrix, until dir(LCon) came to my rescue: it listed GetMatrixTransformToParent(), which gave me the matrix as TMatrix. This TMatrix is again an object whose individual elements have to be accessed one at a time with TMatrix.GetElement(x, y), where x and y are the row and column numbers. I am still not able to get the whole 4x4 matrix in a single go.
Phew! Now, my question is, is there a shorter way to go about it?
To show a custom model or markup at the controller's position you don't even need to do any programming. You can apply a transform to any transformable node by dragging and dropping the node under the transform node in the Data module's Transform hierarchy tab (or by using the Transforms module).
I beg your pardon, I should have clarified before. I am not trying to show a custom model or creating a markup at the controller’s position. In fact, I am not trying to display anything in VR at all.
Let me try to explain myself with an analogy: I am just trying to use the controller as a sword to slice the 3D model (in this case, horizontal to the pose of the controller) and display the slice, continually updating as the location and orientation of the controller change.
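The sword analogy maps directly onto slice-plane geometry: take one local axis of the controller's 4x4 pose as the slice normal, another local axis as the in-plane direction, and the translation column as the slice position. A sketch of that extraction (which local axis counts as "horizontal to the pose" depends on the controller model, so the axis choices below are assumptions):

```python
def slice_frame_from_pose(m):
    """Given a 4x4 pose as a row-major nested list, return the normal
    (local z axis), transverse (local x axis) and position vectors that
    define a slice plane through the controller."""
    normal = [m[0][2], m[1][2], m[2][2]]      # third column: local z axis
    transverse = [m[0][0], m[1][0], m[2][0]]  # first column: local x axis
    position = [m[0][3], m[1][3], m[2][3]]    # fourth column: translation
    return normal, transverse, position

# Inside Slicer (untested sketch), these three vectors could drive the
# Red slice view whenever the controller transform changes:
#   n, t, p = slice_frame_from_pose(matrix_as_nested_list)
#   sliceNode = slicer.app.layoutManager().sliceWidget('Red').mrmlSliceNode()
#   sliceNode.SetSliceToRASByNTP(n[0], n[1], n[2], t[0], t[1], t[2],
#                                p[0], p[1], p[2], 0)
```

The Volume Reslice Driver module mentioned above does exactly this kind of update for you, so the sketch is only meant to show the underlying idea.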
Please clarify another doubt that I have: is the transformation matrix obtained above the same as a quaternion? I have always run away from understanding how quaternions work, but if this requires it, I will give it another shot.
I confirm that you can do the above by using the Volume Reslice Driver module, without writing a single line of code. In fact, this was one of the demos that we showed last week at RSNA - you can download the Slicer scene from here. Just load the scene, enable the virtual reality view, and move your hand into the volume to see the slices. You can switch to a one-up slice view to see the slice full screen on your tablet.
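Regarding the quaternion question: the 4x4 transform above is not a quaternion. It is a homogeneous matrix whose upper-left 3x3 block holds the rotation (and whose last column holds the translation); that rotation block can be converted to a quaternion if some other API needs one, but you do not need quaternions for reslicing. A self-contained conversion sketch (Shepperd's method; R is the 3x3 rotation block as nested lists):

```python
import math

def quaternion_from_rotation(R):
    """Convert a 3x3 rotation matrix (the upper-left block of the 4x4
    transform) to a unit quaternion (w, x, y, z)."""
    tr = R[0][0] + R[1][1] + R[2][2]
    if tr > 0:
        s = math.sqrt(tr + 1.0) * 2
        w = 0.25 * s
        x = (R[2][1] - R[1][2]) / s
        y = (R[0][2] - R[2][0]) / s
        z = (R[1][0] - R[0][1]) / s
    elif R[0][0] > R[1][1] and R[0][0] > R[2][2]:
        s = math.sqrt(1.0 + R[0][0] - R[1][1] - R[2][2]) * 2
        w = (R[2][1] - R[1][2]) / s
        x = 0.25 * s
        y = (R[0][1] + R[1][0]) / s
        z = (R[0][2] + R[2][0]) / s
    elif R[1][1] > R[2][2]:
        s = math.sqrt(1.0 + R[1][1] - R[0][0] - R[2][2]) * 2
        w = (R[0][2] - R[2][0]) / s
        x = (R[0][1] + R[1][0]) / s
        y = 0.25 * s
        z = (R[1][2] + R[2][1]) / s
    else:
        s = math.sqrt(1.0 + R[2][2] - R[0][0] - R[1][1]) * 2
        w = (R[1][0] - R[0][1]) / s
        x = (R[0][2] + R[2][0]) / s
        y = (R[1][2] + R[2][1]) / s
        z = 0.25 * s
    return (w, x, y, z)

# An identity rotation gives the identity quaternion (1, 0, 0, 0)
identity_quat = quaternion_from_rotation([[1, 0, 0], [0, 1, 0], [0, 0, 1]])
```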
Yes, I think so. We can only help if you describe exactly what you did, what you expected to happen, and what happened instead.
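On the "shorter way" to get the whole 4x4 in one go: a pair of list comprehensions over GetElement does it (sketch below; it works for anything exposing a GetElement(row, column) method, such as vtkMatrix4x4). I believe recent Slicer versions also provide slicer.util.arrayFromTransformMatrix(transformNode), which returns the full matrix as a numpy array in a single call.

```python
def matrix_to_list(m):
    """Copy all 16 entries of a vtkMatrix4x4-like object (anything with
    a GetElement(row, column) method) into a plain nested Python list."""
    return [[m.GetElement(row, col) for col in range(4)] for row in range(4)]

# Inside Slicer, with TMatrix obtained as described above:
#   full_matrix = matrix_to_list(TMatrix)
```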