Custom module that tracks the orientation and position of ArUco markers in real time

Hi,

I’m currently writing a custom module that tracks the orientation and position of ArUco markers, but I haven’t figured out how to do the tracking in real time. Could you give me some advice on how to achieve this?

Thanks,
Mogbekeloluwa Adesiyun

Hi, this is already implemented in PLUS: see Optical Marker Tracker in the Plus applications user manual.
There is a tutorial on how to use it in this repository: GitHub - PerkLab/PerkLabBootcamp (materials for the yearly PerkLab bootcamp course). Look at Doc/day2_Plus.pptx.
PLUS communicates with Slicer through the OpenIGTLinkIF module. Using PLUS instead of implementing a direct interface to hardware in Slicer keeps Slicer less dependent on hardware-specific libraries.
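For reference, once a PLUS server is running, connecting to it from Slicer takes only a couple of lines of Python. Here is a minimal sketch (18944 is the default PLUS server port; adjust it to match your config):

```python
import slicer

# Create an OpenIGTLink client connector in the Slicer scene
# and connect it to the PLUS server running on this machine.
connectorNode = slicer.mrmlScene.AddNewNodeByClass("vtkMRMLIGTLConnectorNode")
connectorNode.SetTypeClient("localhost", 18944)  # default PLUS server port
connectorNode.Start()

# Incoming TRANSFORM messages then show up in the scene as transform
# nodes named after the transforms listed in the PLUS server config.
```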

Hi,

Thanks for your reply. The Optical Marker Tracker shown in the link you sent tracks the markers, but it doesn’t record the positions of the markers over time. I’m trying to develop an automatic way of tracking the orientation and position of the markers in real time, so would you be able to point me in the right direction?

Thanks,
Mogbekeloluwa Adesiyun

All PLUS optical tracker devices compute the positions of optical markers relative to the camera by default. PLUS can also compute marker positions relative to each other, based on the transform names you specify in the server section of the PLUS config file. Using the OpenIGTLink server in PLUS, send to Slicer whatever makes the most sense in your application; the tracking information is streamed from PLUS to Slicer in real time, so you don’t need to write any code to use the marker positions in real time in Slicer.
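For illustration, the relevant parts of the config file could look like the sketch below (modeled on the OpticalMarkerTracker sample configs in the Plus user manual; the marker IDs, sizes, dictionary, and calibration file name are placeholders that you would need to adapt):

```xml
<!-- Device section: an OpticalMarkerTracker with two ArUco markers -->
<Device
  Id="TrackerDevice"
  Type="OpticalMarkerTracker"
  CameraCalibrationFile="OpticalMarkerTracker/camera_calibration.yml"
  ToolReferenceFrame="Tracker"
  TrackingMethod="OPTICAL"
  MarkerDictionary="ARUCO_MIP_36h12">
  <DataSources>
    <DataSource Type="Tool" Id="Marker0" MarkerSizeMm="50" />
    <DataSource Type="Tool" Id="Marker1" MarkerSizeMm="50" />
  </DataSources>
  <OutputChannels>
    <OutputChannel Id="TrackerStream">
      <DataSource Id="Marker0" />
      <DataSource Id="Marker1" />
    </OutputChannel>
  </OutputChannels>
</Device>

<!-- Server section: the transforms listed here are streamed to Slicer.
     Marker1ToMarker0 is computed by PLUS from the two marker poses. -->
<PlusOpenIGTLinkServer
  ListeningPort="18944"
  OutputChannelId="TrackerStream"
  SendValidTransformsOnly="true">
  <DefaultClientInfo>
    <MessageTypes>
      <Message Type="TRANSFORM" />
    </MessageTypes>
    <TransformNames>
      <Transform Name="Marker0ToTracker" />
      <Transform Name="Marker1ToMarker0" />
    </TransformNames>
  </DefaultClientInfo>
</PlusOpenIGTLinkServer>
```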

You may add a CaptureDevice to the PLUS config file to record real-time data into files without Slicer, but I recommend using the Sequences module in Slicer to record time-series data: Slicer Sequences gives you more options when you need to replay the data (pause, rewind, etc.).
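For example, recording the incoming transform with Sequences can be scripted in a few lines. A minimal sketch following the Slicer script repository pattern (Marker0ToTracker is a placeholder node name):

```python
import slicer

# Transform node that is updated live through OpenIGTLinkIF
transformNode = slicer.mrmlScene.GetFirstNodeByName("Marker0ToTracker")

# Create a sequence browser and a sequence synchronized with the transform
browserNode = slicer.mrmlScene.AddNewNodeByClass(
    "vtkMRMLSequenceBrowserNode", "MarkerRecording")
sequenceNode = slicer.modules.sequences.logic().AddSynchronizedNode(
    None, transformNode, browserNode)

# Start recording; each change of the transform is stored with a timestamp
browserNode.SetRecording(sequenceNode, True)
browserNode.SetRecordingActive(True)

# ...acquire data, then stop recording and replay from the Sequences GUI:
# browserNode.SetRecordingActive(False)
```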

The PerkLab bootcamp tutorials (link in my previous message) provide step-by-step instructions on how to do all of this.

Okay, I will look into that. Thanks

I went through the webpage you sent, but I don’t think it effectively solves my problem. I’m trying to figure out what code to add to a Slicer module so that it has access to the live scene. I already know the algorithm that will extract the position of the ArUco markers from the transform node; I just need to find out how to use the live scene in the module. Any pointers on what I can do?
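Concretely, I’m hoping to end up with something like this in my module (a rough sketch; Marker0ToTracker is a placeholder for the name of the incoming transform node), but I’m not sure whether an observer like this is the right way to hook into the live scene:

```python
import slicer
import vtk

# Transform node that OpenIGTLinkIF updates with each tracking frame
transformNode = slicer.mrmlScene.GetFirstNodeByName("Marker0ToTracker")

def onTransformModified(caller, event):
    # Called whenever a new tracking frame updates the transform
    matrix = vtk.vtkMatrix4x4()
    caller.GetMatrixTransformToWorld(matrix)
    position = [matrix.GetElement(i, 3) for i in range(3)]
    print("Marker position:", position)
    # ...run my marker position/orientation extraction here...

observerTag = transformNode.AddObserver(
    slicer.vtkMRMLTransformNode.TransformModifiedEvent, onTransformModified)

# To stop observing later:
# transformNode.RemoveObserver(observerTag)
```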

Hey, just following up to see if you have any more advice on this topic.