Guys … could you please suggest steps to get started? Should I integrate with, or create an API interface for, the Phantom Touch?
There are many solutions to this, depending on what you would like to use the haptic interface for.
If you want to control a robot or provide haptic feedback, then you need a very fast control loop, so you would probably want to use ROS and just stream the controller’s pose to Slicer via OpenIGTLink. We have set this up recently using CISST-SAW’s Phantom interface and OpenIGTLink interface. Alternatively, there are many other ROS nodes for Phantom haptic devices, and you can use the ROS IGTL bridge to stream transforms from ROS to Slicer.
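On the Slicer side, receiving those streamed transforms only takes a few lines in the Python console. A minimal sketch, assuming the OpenIGTLinkIF extension is installed; the function name is just a placeholder, and 18944 is the conventional OpenIGTLink port:

```python
def start_igtl_server(port=18944):
    """Create and start an OpenIGTLink server connector in Slicer so a
    ROS node (e.g. ros_igtl_bridge) can stream TRANSFORM messages in.
    Must be run inside Slicer's Python environment."""
    import slicer
    connector = slicer.mrmlScene.AddNewNodeByClass("vtkMRMLIGTLConnectorNode")
    connector.SetTypeServer(port)
    connector.Start()
    # Incoming transforms show up as linear transform nodes named after
    # the OpenIGTLink device name; apply one to a model or view as needed.
    return connector
```

Once the connector is running, you can watch incoming nodes in the OpenIGTLinkIF module GUI.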
If you just want to use the Phantom as a 3D mouse, then you can probably find a Python package that you can pip-install into Slicer’s Python environment to receive the transform and write it into a transform node in Slicer.
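The glue code for that approach would look roughly like the sketch below. The `phantom_driver` package, its `get_pose()` call, and the node name are all hypothetical placeholders for whatever package you end up finding; only the quaternion-to-matrix math and the Slicer transform-node calls are standard:

```python
def pose_to_matrix(position, quaternion):
    """Build a 4x4 homogeneous transform (row-major nested lists) from a
    position [x, y, z] and a unit quaternion [w, x, y, z]."""
    w, x, y, z = quaternion
    px, py, pz = position
    return [
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y),     px],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x),     py],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y), pz],
        [0.0, 0.0, 0.0, 1.0],
    ]

def update_transform_node(node_name="PhantomToRAS"):
    """Poll the device once and write its pose into a Slicer transform node.
    'phantom_driver' is a placeholder: replace get_pose() with the actual
    call of whichever pip package you find."""
    import slicer, vtk
    import phantom_driver  # hypothetical package name
    position, quaternion = phantom_driver.get_pose()
    m = pose_to_matrix(position, quaternion)
    vtk_matrix = vtk.vtkMatrix4x4()
    for r in range(4):
        for c in range(4):
            vtk_matrix.SetElement(r, c, m[r][c])
    node = slicer.mrmlScene.GetFirstNodeByName(node_name)
    if node is None:
        node = slicer.mrmlScene.AddNewNodeByClass("vtkMRMLLinearTransformNode", node_name)
    node.SetMatrixTransformToParent(vtk_matrix)
```

For continuous updates you would call `update_transform_node()` periodically, e.g. from a `qt.QTimer` in Slicer.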
You can also create a small C++ loadable module that uses the OpenHaptics SDK to communicate with the device.
What would you like to achieve?
Hi,
Thanks, excellent feedback. I’m trying to use the Phantom as a 3D mouse to interact with Slicer.
Could you please point me to a Python package that can be pip-installed?
Thanks again.
Best wishes,
Ajit
If searching on Google and PyPI does not bring up anything usable, then ask 3D Systems support (or their community forum, if they have one). If they cannot help, you still have all the other options. Also, if you just need a 3D mouse (no force feedback), then you have much better options:
- you can get two full 6-DOF controllers (not just one 5-DOF controller) with lots of buttons and an immersive stereo display from a virtual reality headset
- you can buy an optical tracker, such as the OptiTrack Duo (for a fraction of the price, with a wireless tool), and connect it using SlicerIGT
- if you don’t need high absolute accuracy (just accurate relative positioning), then you can use a single webcam, glue a 2D barcode onto a pencil, and use that as a 3D mouse
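For the webcam-and-barcode option, an ArUco marker works well as the 2D barcode. A rough sketch with OpenCV (assumes `opencv-contrib-python` ≥ 4.7 for the `ArucoDetector` API; the camera intrinsics below are placeholder values and should come from a one-time calibration):

```python
def marker_pose_from_frame(frame, marker_length_mm=30.0,
                           camera_matrix=None, dist_coeffs=None):
    """Estimate the 3D pose of a single ArUco marker glued to a pencil.
    Returns (rvec, tvec) in camera coordinates, or None if no marker is
    visible in the frame."""
    import cv2
    import numpy as np
    if camera_matrix is None:
        # Placeholder intrinsics for a 640x480 webcam; calibrate your own.
        camera_matrix = np.array([[800.0, 0.0, 320.0],
                                  [0.0, 800.0, 240.0],
                                  [0.0, 0.0, 1.0]])
    if dist_coeffs is None:
        dist_coeffs = np.zeros(5)
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    detector = cv2.aruco.ArucoDetector(dictionary)
    corners, ids, _ = detector.detectMarkers(frame)
    if ids is None:
        return None
    # Solve PnP against the marker's known physical corner coordinates.
    half = marker_length_mm / 2.0
    object_points = np.array([[-half,  half, 0.0],
                              [ half,  half, 0.0],
                              [ half, -half, 0.0],
                              [-half, -half, 0.0]], dtype=np.float32)
    ok, rvec, tvec = cv2.solvePnP(object_points, corners[0].reshape(4, 2),
                                  camera_matrix, dist_coeffs)
    return (rvec, tvec) if ok else None
```

You would call this on each frame grabbed from the webcam and stream the resulting pose into a Slicer transform node, the same way as with any other tracker.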