Configuring a robot arm (Panda - Franka Emika) in Slicer

I went through the ROSMED tutorials where a robot arm is loaded to perform a needle insertion. My goal is to recreate this action but with another robot arm model. From what I can see in the tutorial, I would need to get the specific STL files for all the robot links AND the transformation matrices (h5 files in the tutorial) and integrate them into a .mrml scene to be able to visualize and move the robot around in 3D Slicer. I mainly have two questions:

  1. If I just load the STL files and issue commands via OpenIGTLink for an arbitrarily planned path in RViz, would I then be able to visualize the correct movement in 3D Slicer even if I did not construct the robot using a hierarchy of transforms in Slicer? I would think that is the case, since RViz already has this information and 3D Slicer would just visualize movements that raise no issues in RViz/MoveIt! motion planning.

  2. In case that is not so, is there any information on how to proceed in “assembling” the robot arm in 3D Slicer? In the ROSMED tutorial a .mrml scene is loaded, so I am wondering what the required process is for creating that scene with the mentioned robot arm. At the moment I just have the 3D model of the robot arm, so maybe I would need to calculate the transformation matrices of the links, but I am unsure how.

Thanks for your support, I really appreciate it.

All of what you mentioned is possible and it is your decision how you want things to work. What would you send to Slicer from ROS?

  • OpenIGTLink transform containing the absolute position+orientation of each part of the robot arm (in Slicer there is no need to create any transform tree; just apply the transform directly to the model nodes, as in the sketch after this list)
  • OpenIGTLink transform containing the relative translation+rotation between joints (in Slicer you need to arrange the received transforms in a tree by drag-and-drop in the transform hierarchy)
  • OpenIGTLink transform containing the relative rotation between joints (in Slicer you need to arrange the received transforms in a tree and add relative translations between them)
  • OpenIGTLink ndarray containing joint angles (in Slicer you need to add a small Python script that computes relative transforms from the joint angles)
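
In case a concrete starting point helps, here is a minimal sketch of the first option on the Slicer side, run from the Slicer Python console. The port, link names, and mesh paths are placeholders, and it assumes the SlicerOpenIGTLink extension is installed:

```python
# Minimal sketch of the first option (absolute pose per link), run in the
# Slicer Python console. Link names and mesh paths are placeholders.
import slicer

# Start an OpenIGTLink server that the ROS side connects to
connector = slicer.mrmlScene.AddNewNodeByClass('vtkMRMLIGTLConnectorNode')
connector.SetTypeServer(18944)
connector.Start()

link_names = ['panda_link0', 'panda_link1', 'panda_link2']  # extend for all links

for name in link_names:
    # Load the STL of each link
    model = slicer.util.loadModel('/path/to/meshes/' + name + '.stl')
    # Incoming transforms show up as transform nodes named after the IGTL
    # device name; once they exist, parent each model directly under its
    # link transform (no transform tree needed)
    transform = slicer.mrmlScene.GetFirstNode(name, 'vtkMRMLLinearTransformNode')
    if transform:
        model.SetAndObserveTransformNodeID(transform.GetID())
```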

Hello Mr Laso, thanks for your kind reply. In my case I would be sending information about the robot's initial -> target pose via MoveIt! (same as described in the ROSMED tutorial), so I think the information sent would be 1) the absolute position+orientation of each joint (although I might be wrong; based on the MoveIt! RobotState class I would assume this is the case).
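
For reference, one way to read the absolute pose of each link on the ROS side, independently of how MoveIt! reports its RobotState, is to query tf. A minimal sketch with tf2_ros, where the frame names follow the Panda URDF convention and are assumptions:

```python
#!/usr/bin/env python
# Sketch: read the absolute pose of each robot link from tf (ROS 1).
# Frame names follow the Panda URDF convention; adjust for your robot.
import rospy
import tf2_ros

rospy.init_node('link_pose_reader')
tf_buffer = tf2_ros.Buffer()
listener = tf2_ros.TransformListener(tf_buffer)
rospy.sleep(1.0)  # let the listener fill its buffer

for link in ['panda_link1', 'panda_link2', 'panda_link8']:
    # Absolute position+orientation of the link in the world frame
    t = tf_buffer.lookup_transform('world', link, rospy.Time(0))
    print(link, t.transform.translation, t.transform.rotation)
```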

My current goal is to visualize the same needle insertion technique from the ROSMED tutorial but with a different robot arm, that is, to visualize robot arm movements/needle paths in Slicer. For this I have to assemble my own scene (with a patient model and the mentioned robot arm), but I am unsure how to load the robot parts into Slicer as .stl files. I will try to assemble the robot arm in Slicer as recommended in other posts, that is, by creating a transform hierarchy, and then create my own robot arm + patient scene and try to recreate the tutorial steps.
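
In case it is useful to anyone reading later, here is a rough sketch of that assembly step in the Slicer Python console; the mesh paths, node names, and number of links are placeholders:

```python
# Sketch: build a transform hierarchy for the arm in the Slicer Python console.
# One linear transform node per joint, each child parented under the previous
# one; paths and names are placeholders.
import slicer

parent_id = None
for i in range(8):  # e.g. the Panda has links 0..7 plus the hand
    transform = slicer.mrmlScene.AddNewNodeByClass(
        'vtkMRMLLinearTransformNode', 'Joint%d' % i)
    if parent_id:
        transform.SetAndObserveTransformNodeID(parent_id)
    model = slicer.util.loadModel('/path/to/meshes/link%d.stl' % i)
    model.SetAndObserveTransformNodeID(transform.GetID())
    parent_id = transform.GetID()
```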

I apologize if my question is too naive or vague; I am fairly new to this incredible tool and doing my best to understand it properly.

To visualize the robot in Slicer, you need to have information about position+orientation of each robot part. You send this information from ROS to Slicer. What parameters do you send? Absolute position+orientation, relative transforms, … (see the list above)?

I think I understand it now. After analyzing the robot in the ROSMED Slicer scene and the structure of the project in general: the position+orientation of the robot (a homogeneous matrix containing the rotational+translational part of each link) is sent with igtl_exporter.py via the IGTL_Bridge, so I would need to modify this script (igtl_exporter.py) according to my robot's characteristics (links). That means I can “assemble” the robot in Slicer by applying a transform directly to the model nodes, as you suggest in the first option of your main reply, and send the position+orientation of each link via ROS whenever I want to achieve a new robot pose.
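
For completeness, the sending side does not have to go through the ROS bridge; as a stand-in for the tutorial's igtl_exporter.py, here is a minimal sketch using the pyigtl package directly. The device names and the matrix source are placeholders; in practice the 4x4 would come from tf or MoveIt! as discussed above:

```python
# Sketch: send per-link homogeneous transforms to Slicer with pyigtl
# (pip install pyigtl). Device names must match what the Slicer side expects.
import numpy as np
import pyigtl

server = pyigtl.OpenIGTLinkServer(port=18944)

def send_link_pose(link_name, matrix4x4):
    # matrix4x4: 4x4 numpy array, rotation in [0:3, 0:3], translation in [0:3, 3]
    message = pyigtl.TransformMessage(matrix4x4, device_name=link_name)
    server.send_message(message)

# Example: identity pose for one link
send_link_pose('panda_link1', np.eye(4))
```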

Thank you very much for pointing me in the right direction
