I’m currently writing a custom module that tracks the orientation and position of ArUco markers, but I haven’t figured out how to access the marker poses in real time. Could you give me some advice on how to achieve this?
Thanks for your reply. The Optical Marker Tracker shown in the link you sent tracks the markers, but it doesn’t record the markers’ positions over time. I’m trying to develop an automatic way of tracking the orientation and position of the markers in real time, so would you be able to point me in the right direction?
All PLUS optical tracker devices compute the positions of optical markers relative to the camera by default. PLUS can also compute their positions relative to each other, based on the transform names you specify in the config file. Using the OpenIGTLink server in PLUS, you can send to Slicer whatever makes the most sense in your application: PLUS computes the transforms between markers based on the transform names you list in the server section of the PLUS config file, and streams the tracking information to Slicer in real time. You don’t need to write any code to use the marker positions in real time in Slicer.
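For example, the relevant parts of a PLUS config file might look something like the sketch below. The device IDs, marker sizes, and transform names are placeholders for your setup, and exact attribute names can vary between PLUS versions, so check the OpticalMarkerTracker page in the PLUS device documentation:

```xml
<PlusConfiguration version="2.1">
  <DataCollection StartupDelaySec="1.0">
    <!-- ArUco tracking device: computes each MarkerXToTracker transform -->
    <Device Id="TrackerDevice" Type="OpticalMarkerTracker"
            CameraCalibrationFile="OpticalMarkerTracker/CameraCalibration.yml"
            MarkerDictionary="ARUCO_MIP_36h12"
            ToolReferenceFrame="Tracker">
      <DataSources>
        <DataSource Type="Tool" Id="Marker1" MarkerId="1" MarkerSizeMm="50" />
        <DataSource Type="Tool" Id="Marker2" MarkerId="2" MarkerSizeMm="50" />
      </DataSources>
      <OutputChannels>
        <OutputChannel Id="TrackerStream">
          <DataSource Id="Marker1" />
          <DataSource Id="Marker2" />
        </OutputChannel>
      </OutputChannels>
    </Device>
  </DataCollection>
  <!-- Server section: the transform names listed here are what PLUS
       computes and streams to Slicer; Marker2ToMarker1 is an example
       of a marker-to-marker transform that PLUS derives for you -->
  <PlusOpenIGTLinkServer ListeningPort="18944" OutputChannelId="TrackerStream"
                         SendValidTransformsOnly="true">
    <DefaultClientInfo>
      <MessageTypes>
        <Message Type="TRANSFORM" />
      </MessageTypes>
      <TransformNames>
        <Transform Name="Marker1ToTracker" />
        <Transform Name="Marker2ToMarker1" />
      </TransformNames>
    </DefaultClientInfo>
  </PlusOpenIGTLinkServer>
</PlusConfiguration>
```

Once Slicer’s OpenIGTLinkIF module connects to the server (port 18944 is the PLUS default), each listed transform shows up in the Slicer scene as a transform node that updates in real time.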
You may add a CaptureDevice in the PLUS config file to record real-time data to file without Slicer, but I recommend using the Sequences module in Slicer to record time-series data. Slicer Sequences gives you more options when you need to replay the data (pause, rewind, etc.).
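If you go the Sequences route, recording an incoming transform can be set up with a few lines of Python. A minimal sketch, assuming a transform node named Marker1ToTracker (a placeholder from the config example above) is already being updated in the scene by OpenIGTLinkIF, and a recent Slicer version:

```python
import slicer

# The transform node that PLUS keeps updating via OpenIGTLink
transformNode = slicer.mrmlScene.GetFirstNodeByName("Marker1ToTracker")

# Browser node drives recording and later replay (pause, rewind, etc.)
browserNode = slicer.mrmlScene.AddNewNodeByClass("vtkMRMLSequenceBrowserNode")
sequenceNode = slicer.mrmlScene.AddNewNodeByClass("vtkMRMLSequenceNode")
browserNode.AddSynchronizedSequenceNode(sequenceNode)
browserNode.AddProxyNode(transformNode, sequenceNode, False)

# Start recording: every change of the transform is stored with a timestamp
browserNode.SetRecording(sequenceNode, True)
browserNode.SetRecordingActive(True)

# ... later, stop recording and use the browser node (or the Sequences
# module GUI) to replay the recorded poses:
# browserNode.SetRecordingActive(False)
```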
The PerkLab bootcamp tutorials (link in my previous message) provide step-by-step instructions for how to do all of this.
I went through the webpage you sent, but I don’t think it quite solves my problem. I’m trying to figure out what code to add to a Slicer module so that it has access to the live scene. I already know the algorithm that will extract the position of the ArUco markers from the transform node; I just need to find out how to work with the live scene in the module. Any pointers on what I can do?
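A minimal sketch of what this could look like in a scripted module: connect to the PLUS server, then observe the incoming transform node so your own code runs on every pose update. The node name Marker1ToTracker and the hostname/port are placeholder assumptions carried over from the config example above:

```python
import vtk
import slicer

# Connect to the PLUS OpenIGTLink server (skip if the connector already
# exists, e.g. created through the OpenIGTLinkIF module GUI)
connectorNode = slicer.mrmlScene.AddNewNodeByClass("vtkMRMLIGTLConnectorNode")
connectorNode.SetTypeClient("localhost", 18944)
connectorNode.Start()

# The live transform node that PLUS pushes into the scene
transformNode = slicer.mrmlScene.GetFirstNodeByName("Marker1ToTracker")

def onTransformModified(caller, event):
    # Called each time PLUS sends a new pose for this marker
    matrix = vtk.vtkMatrix4x4()
    caller.GetMatrixTransformToParent(matrix)
    position = [matrix.GetElement(i, 3) for i in range(3)]
    print("Marker position:", position)
    # run your own orientation/position extraction on `matrix` here

observerTag = transformNode.AddObserver(
    slicer.vtkMRMLTransformNode.TransformModifiedEvent, onTransformModified)

# When your module is torn down, remove the observer:
# transformNode.RemoveObserver(observerTag)
```

The scene a module sees through slicer.mrmlScene is already the live scene; the observer pattern above is just the standard way to react to each update instead of polling.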