Issue with ProjectSemiLM

Hello, I am trying to use the ProjectSemiLM utility to project a field of pseudolandmarks across my dataset of squamate wrist elements. I have manually placed 6 orientation landmarks on each element, and have selected one specimen out of the sample to be the base mesh on which I generated the pseudolandmarks.

When I try this, I receive the following warning message, which repeats for each element I am trying to project the landmarks onto:

Warning: In vtkMRMLMarkupsNode.cxx, line 3166
vtkMRMLMarkupsFiducialNode (000002951C72DBB0): vtkMRMLMarkupsNode::GetMarkupPoint method is deprecated, please use GetNthControlPointPosition instead

I was wondering if there is something wrong with the way my orientation landmarks are formatted (I have them all saved as .mrk.json files), or if there is some other way I could address this.

Thanks,
Dalton

While annoying, it is just a warning. You can ignore it.

Are the LMs not being projected?

No, the LMs are not being projected; I had assumed that warning was why.

This traceback message also comes up:

Traceback (most recent call last):
  File "C:/Users/Dalton/AppData/Local/slicer.org/Slicer 5.8.0/slicer.org/Extensions-33216/SlicerMorph/lib/Slicer-5.8/qt-scripted-modules/ProjectSemiLM.py", line 198, in onApplyButton
    logic.run(self.modelSelector.currentNode(), self.baseLMSelect.currentNode(), self.baseSLMSelect.currentNode(), self.meshDirectory.currentPath,
  File "C:/Users/Dalton/AppData/Local/slicer.org/Slicer 5.8.0/slicer.org/Extensions-33216/SlicerMorph/lib/Slicer-5.8/qt-scripted-modules/ProjectSemiLM.py", line 239, in run
    subjectID = [int(x) for x in regex.findall(meshFileName)][0]
IndexError: list index out of range
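For what it's worth, the failing line builds a list of every integer found in the mesh file name and takes the first one, so a file name containing no digits yields an empty list and the IndexError above. A minimal sketch of that pattern with a guard (the `\d+` regex and the `subject_id` helper are my assumptions for illustration, not the module's actual code):

```python
import re

# Assumed pattern: extract runs of digits from a mesh file name.
regex = re.compile(r"\d+")

def subject_id(mesh_file_name):
    """Return the first integer found in the file name, or None if absent."""
    matches = [int(x) for x in regex.findall(mesh_file_name)]
    # Guarding the empty-list case avoids the IndexError seen in the traceback.
    return matches[0] if matches else None

print(subject_id("specimen_017_wrist.ply"))  # digits present -> 17
print(subject_id("wrist_left.ply"))          # no digits -> None
```

This would explain why renaming files so that each name contains a numeric ID lets the projection run through the whole dataset.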

The error suggests there is an issue with the number of landmarks in your samples, but I can't replicate it on my end.

Can you provide a sample set of your LMs?

Absolutely, I just sent you some of them via email. Thank you very much.

Just wanted to update that this issue was solved with the latest update to SlicerMorph! I still had to go in and rename many of the files in order to process the entire dataset successfully, but it worked!

Thanks so much to the dev team!
