I went through the ROSMED tutorials where a robot arm is loaded to perform a needle insertion. My goal is to recreate this action but with another robot arm model. From what I can see in the tutorial, I would need to get the specific STL files for all the robot links AND the transformation matrices (H5 files in the tutorial) and integrate them into a .mrml scene to be able to visualize and move the robot around in 3D Slicer. I mainly have two questions:
If I just load the STL files and issue commands via IGTLink for an arbitrarily planned path in RViz, then I would be able to visualize the correct movement in 3D Slicer even if I did not construct the robot using a hierarchy of transforms in Slicer, right? I would think that would be the case, since RViz already has this information and 3D Slicer would just visualize the movements that arise without issues in RViz/MoveIt! motion planning.
In case that's not the case, is there any information on how to proceed in "assembling" the robot arm in 3D Slicer? In the ROSMED tutorial a .mrml scene is loaded, so I'm wondering what the required process is for creating that scene with the mentioned robot arm. At the moment I just have the 3D model of the robot arm, so maybe I would need to calculate the transformation matrices of the links, but I'm unsure how.
All of what you mentioned is possible and it is your decision how you want things to work. What would you send to Slicer from ROS?
OpenIGTLink transform containing absolute position+orientation of each part of the robot arm (in Slicer there is no need for creating any transform tree, just apply the transform directly to the model nodes)
OpenIGTLink transform containing relative translation+rotation between joints (in Slicer you need to arrange the received transforms in a tree by drag-and-drop in the transform hierarchy)
OpenIGTLink transform containing relative rotation between joints (in Slicer you need to arrange the received transforms in a tree and add relative translations between them)
OpenIGTLink ndarray containing joint angles (in Slicer you need to add a small python script that computes relative transforms from the joint angles)
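For the last option, a minimal sketch of such a script, runnable in Slicer's Python console, is shown below. It assumes a revolute joint rotating about its Z axis; the node name ("Joint2"), the joint axis, and the link offset are illustrative assumptions, not values from the tutorial:

```python
# Minimal sketch for the joint-angle option: convert one joint angle into a
# relative 4x4 transform and push it into a Slicer transform node.
# The joint axis (Z), the link offset, and the node name are assumptions.
import numpy as np
import slicer

def joint_to_matrix(angle_rad, link_offset_mm):
    """Rotation about Z followed by a fixed translation along the link."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    return np.array([[c, -s, 0.0, 0.0],
                     [s,  c, 0.0, 0.0],
                     [0.0, 0.0, 1.0, link_offset_mm],
                     [0.0, 0.0, 0.0, 1.0]])

# Example: drive a hypothetical "Joint2" linear transform node to 30 degrees
transform_node = slicer.util.getNode('Joint2')
slicer.util.updateTransformMatrixFromArray(
    transform_node, joint_to_matrix(np.deg2rad(30.0), link_offset_mm=120.0))
```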
Hello Mr. Lasso, thanks for your kind reply. In my case I would be sending information about the robot's initial -> target pose via MoveIt! (same as described in the ROSMED tutorial), so I think the information sent would be 1) the absolute position+orientation of each joint (although I might be wrong; based on the MoveIt! RobotState class I would assume this is the case).
My current goal is to visualize the same needle insertion technique from the ROSMED tutorial but using a different robot arm, that is, to visualize robot arm movements/needle paths in Slicer. For this I have to assemble my own scene (with a patient model and the mentioned robot arm), but I'm unsure about how to load the robot parts into Slicer as .stl files. I will try to assemble the robot arm in Slicer as recommended in other posts, that is, by creating a transform hierarchy, and then create my own robot arm + patient scene and try to recreate the tutorial steps.
I apologize if my question is too naive or vague; I'm fairly new to this incredible tool and doing my best to understand it properly.
To visualize the robot in Slicer, you need to have information about the position+orientation of each robot part. You send this information from ROS to Slicer. What parameters do you send? Absolute position+orientation, relative transforms, … (see the list above)?
I think I understand it now, after analyzing the robot in the ROSMED Slicer scene and the structure of the project in general. The position+orientation of the robot (a homogeneous matrix containing the rotational+translational part of each link) is sent with igtl_exporter.py via the IGTL bridge, therefore I would need to modify this script (igtl_exporter.py) according to my robot's characteristics (links). That means I can "assemble" the robot in Slicer by applying a transform directly to the model nodes, as you suggest in the first option of your main reply, and send the position+orientation of each link via ROS whenever I want to achieve a new robot pose.
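For example, hooking each imported link model to its incoming transform can be done by drag-and-drop in the Data module, or with a short snippet in Slicer's Python console. This is only a sketch, assuming the STL models were loaded as nodes named link_1_model, link_2_model, ... and that OpenIGTLinkIF creates transform nodes named after the transforms sent from ROS (link_1, link_2, ...):

```python
# Hedged sketch: hook each link model to the transform node that receives
# its absolute pose over OpenIGTLink. All node names are assumptions.
import slicer

for link in ['link_1', 'link_2', 'link_3', 'link_4', 'link_5', 'link_6']:
    model = slicer.util.getNode(link + '_model')   # imported STL model node
    transform = slicer.util.getNode(link)          # transform created by OpenIGTLinkIF
    model.SetAndObserveTransformNodeID(transform.GetID())
```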
Thank you very much for pointing me in the right direction
Did you manage to get the integration of your own robot with the ROSMED tutorials working? If so, could you provide some guidance on the steps you took?
I was able to replace the UR5 robot in the ROSMED tutorials with the Franka Emika Panda robot.
What you need to do first is to set up your robot in MoveIt!. Many robot models are already available for motion planning in MoveIt!; the one I used was luckily one of them, but if you are using one that is not available, they provide guidance on how to configure it.
Then you need to copy the igtl_exporter.py file from the ROSMED tutorial package to your robot's catkin_ws package (you can also create your own script). You need to modify this file according to your specific robot link names. If you want to visualize your robot in Slicer, you can import the robot STL model and assign the transforms that are communicated to Slicer via this igtl_exporter Python script; you apply these transforms directly to the individual robot links accordingly. If you don't have the STL model, you can also visualize the transforms in Slicer, which provide insight into the robot links' position+orientation. A rough sketch of such an exporter is shown below.
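As an illustration only (not the tutorial's actual script): such an exporter can look up the absolute pose of each link with tf and forward it to Slicer through the bridge. The link names, the fixed frame ('world'), the publishing rate, the metres-to-millimetres conversion, and the igtltransform message layout from ros_igtl_bridge are assumptions to adapt to your own setup:

```python
#!/usr/bin/env python
# Hedged sketch of an igtl_exporter-style node: look up the absolute pose
# of each link with tf and forward it to Slicer via ros_igtl_bridge.
# Link names, fixed frame, rate, and unit conversion are assumptions.
import rospy
import tf
from ros_igtl_bridge.msg import igtltransform

LINK_NAMES = ['link_1', 'link_2', 'link_3', 'link_4', 'link_5', 'link_6']

def main():
    rospy.init_node('igtl_exporter')
    listener = tf.TransformListener()
    pub = rospy.Publisher('IGTL_TRANSFORM_OUT', igtltransform, queue_size=10)
    rate = rospy.Rate(30)
    while not rospy.is_shutdown():
        for link in LINK_NAMES:
            try:
                (trans, rot) = listener.lookupTransform('world', link, rospy.Time(0))
            except (tf.LookupException, tf.ConnectivityException,
                    tf.ExtrapolationException):
                continue
            msg = igtltransform()
            msg.name = link  # must match the transform node name in Slicer
            # ROS works in metres; Slicer scenes are usually in millimetres
            msg.transform.translation.x = trans[0] * 1000.0
            msg.transform.translation.y = trans[1] * 1000.0
            msg.transform.translation.z = trans[2] * 1000.0
            msg.transform.rotation.x = rot[0]
            msg.transform.rotation.y = rot[1]
            msg.transform.rotation.z = rot[2]
            msg.transform.rotation.w = rot[3]
            pub.publish(msg)
        rate.sleep()

if __name__ == '__main__':
    main()
```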
Once this is covered, you can visualize robot trajectories planned with MoveIt! in Slicer. This is the stage I'm currently at in my project; I still need to work on the segmentation and needle path planning. Don't hesitate to ask if you have further questions.
I've followed your steps but still have not managed to get it working, although I do think I'm on the right track. I'm just going to list the steps that I took to give you a better understanding of the situation. This has all been done in my own workspace.
I have copied over the ros_igtl_bridge package from ISMR19 Workshop | ROS for Medical Robotics and ran the following command
"catkin_make --cmake-args -DOpenIGTLink_DIR:PATH=~/igtl/OpenIGTLink-build" to build the packages for ros_igtl_bridge
I created a package transform_exporter in my workspace (following the same build dependencies as the ismr19_control package from the ROSMED tutorial) and created a "scripts" folder into which I copied the igtl_exporter.py script.
In 3D Slicer, I have loaded my robot link STLs (converting them from metres to millimetres for 3D Slicer) and have applied linear transforms to each individual link (translating them into place using the translation sliders); image seen below.
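In case it helps others, one way to do the metres-to-millimetres conversion is to harden a 1000x scaling transform into each imported mesh in Slicer's Python console. A minimal sketch, assuming a model node named link_1 (the node name is an assumption):

```python
# Hedged sketch: convert an imported STL from metres to millimetres in Slicer
# by hardening a 1000x scaling transform into the mesh. Node names are assumptions.
import slicer
import vtk

scale = vtk.vtkMatrix4x4()
for i in range(3):
    scale.SetElement(i, i, 1000.0)  # uniform metres -> millimetres scaling

scale_node = slicer.mrmlScene.AddNewNodeByClass(
    'vtkMRMLLinearTransformNode', 'MetresToMillimetres')
scale_node.SetMatrixTransformToParent(scale)

model = slicer.util.getNode('link_1')            # imported STL, still in metres
model.SetAndObserveTransformNodeID(scale_node.GetID())
model.HardenTransform()                          # bake the scaling into the mesh
slicer.mrmlScene.RemoveNode(scale_node)          # scaling no longer needed
```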
I have modified the igtl_exporter.py script to suit the links of my robot. The ABB IRB 120 robot has link_1, link_2, link_3, link_4, link_5, link_6 and tool0/flange (see image below).
To test my progress, I ran the ros_igtl_bridge node (bridge.launch) and the igtl_exporter node. I then opened 3D Slicer, made sure that the connection status is ON in OpenIGTLinkIF (port 18944), and then proceeded to load up my ABB IRB 120 robot in MoveIt!, perform a plan, and execute it (see image below).
I believe I've followed your steps, but maybe not to the exact detail (could you possibly elaborate on the exact steps you took?), which is probably why my robot doesn't work as intended.
Could it possibly be how I set up my linear transform nodes in 3D Slicer (I set mine up using the Transforms module and manually translated each link into position to set up the robot)? Or how I copied over the ros_igtl_bridge package or the igtl_exporter.py script (maybe I didn't set up my transform_exporter package correctly)? Or is there something else on the ROS side of things that I need to edit/modify? Even though I got the MoveIt! packages from the GitHub repository, maybe the planning group or ROS controller isn't set up properly?
I'm greatly appreciative of the guidance you have provided me thus far, and I look forward to your response. I'd love to hear in detail how you set your own robot up to get it working with the ROSMED project; maybe I can mimic the exact steps you took to get mine up and running.
After a bit of tinkering I got it working!!! I'd like to thank you so much for your help. The next step for me is the same as for you: getting the needle path planning set up. I think that would just be adding an extra link to the end of the robot (end-effector), setting up the transformation for that link in 3D Slicer, and adding the end-effector link to the existing MoveIt! package.
@Cesar_Puga @CHRIS_HUYNH Could you post a nice screenshot (and acknowledgment text) of what you achieved, so that we could showcase it in the image carousel on Slicer's upcoming redesigned website? Thank you!
The shared document for the integration is in the above link. The mp4 file may be a bit laggy, because having multiple programs open and nodes running in the background uses a lot of my laptop's power.
I would like to acknowledge Junichi Tokuda for providing the ROS and 3D Slicer integration project, along with all the associated files and scripts; Andras Lasso for providing support for setting up my ABB IRB 120 robot in 3D Slicer and applying transformations to each link of my robot; and Cesar Puga for providing guidance on how to set up my own ABB IRB 120 robot to replace the UR5 robot in the ROSMED ros-slicer-integration project.