Hello Mr. Laso,

Thanks for your kind reply. In my case I would be sending information about the robot's initial -> target pose via MoveIt! (the same as described in the RosMed tutorial), so I think the information sent would be the absolute position + orientation of each joint (although I might be wrong; based on MoveIt!'s RobotState class, I would assume this is the case).
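For context, here is a minimal sketch of how I understand the MoveIt! Python API exposes this state ("manipulator" is a placeholder group name, not necessarily what my arm's config defines):

```python
# Minimal sketch: inspect what MoveIt! knows about the robot's pose.
# "manipulator" is a hypothetical MoveGroup name; replace with your own.
import sys
import rospy
import moveit_commander

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node("pose_inspector", anonymous=True)

robot = moveit_commander.RobotCommander()
group = moveit_commander.MoveGroupCommander("manipulator")  # hypothetical group name

# Joint-space state: one value per joint.
print(group.get_current_joint_values())

# Cartesian pose of the end-effector link in the planning frame.
print(group.get_current_pose().pose)

# The full RobotState message that MoveIt! plans with.
print(robot.get_current_state())
```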
My current goal is to visualize the same needle-insertion technique from the RosMed tutorial but with a different robot arm, i.e., to visualize the robot arm movements / needle paths in Slicer. For this I have to assemble my own scene (with a patient model and the robot arm in question), but I am unsure how to load the robot parts into Slicer as .stl files. I will try to assemble the robot arm in Slicer as recommended in other posts, i.e., by creating a transform hierarchy (see the sketch below), and then create my own robot arm + patient scene and try to recreate the tutorial steps.
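Based on those posts, this is a minimal sketch of what I plan to try in Slicer's Python console; the file paths, node names, and two-link structure below are placeholders for my own setup:

```python
# Minimal sketch: load robot links as STL models and parent them under a
# transform hierarchy in Slicer. Paths and names are placeholders.
import slicer

# Load each robot part as a model node.
baseModel = slicer.util.loadModel("/path/to/robot/base_link.stl")
link1Model = slicer.util.loadModel("/path/to/robot/link_1.stl")

# One linear transform node per link.
baseTransform = slicer.mrmlScene.AddNewNodeByClass(
    "vtkMRMLLinearTransformNode", "BaseTransform")
link1Transform = slicer.mrmlScene.AddNewNodeByClass(
    "vtkMRMLLinearTransformNode", "Link1Transform")

# Chain the transforms: link_1 moves relative to the base.
link1Transform.SetAndObserveTransformNodeID(baseTransform.GetID())

# Attach each model to its transform, so updating a transform moves the mesh
# (and everything nested below it).
baseModel.SetAndObserveTransformNodeID(baseTransform.GetID())
link1Model.SetAndObserveTransformNodeID(link1Transform.GetID())
```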
I apologize if my question is too naive or vague; I am fairly new to this incredible tool and doing my best to understand it properly.