Does lower tolerance spacing and having more landmarks make it easier to get a statistical shape model?
That’s not an easy question to answer. Some papers suggest that denser sampling of the geometry does help, but some statistical procedures become computationally infeasible once you are using thousands of points.
For mammal skulls we typically aim for the low few hundreds. For me, how regular and smooth the point distribution is matters far more than the absolute number of points.
Thank you very much, doc. Are there any papers on how to make the landmarks smooth and regular? Please give some advice on how to set good landmarks.
I am not aware of any papers specifically on that. That’s usually a function of how the underlying 3D model is derived, whether the vertices are roughly equally distributed on the model, etc. If you run one of your models with PseudoLMGenerator and choose the “model geometry” option, you can quickly see what the point distribution looks like and whether you need to smooth and edit your model further (or not).
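One quick way to put a number on how evenly distributed a set of (pseudo)landmarks is, beyond eyeballing it, is to look at the spread of nearest-neighbour distances. The sketch below is only an illustration of that idea, not part of PseudoLMGenerator; the function name and the uniform-grid example are hypothetical.

```python
import numpy as np

def spacing_stats(points):
    """Mean and coefficient of variation (CV) of nearest-neighbour distances
    for an (N, 3) array of landmark coordinates. A CV near zero suggests
    roughly uniform spacing; a large CV suggests clumping or gaps."""
    # Brute-force pairwise distances; fine for a few hundred landmarks.
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)     # ignore each point's zero self-distance
    nn = d.min(axis=1)              # nearest-neighbour distance per point
    return nn.mean(), nn.std() / nn.mean()

# Toy check: points on a perfectly regular 5x5x5 grid have CV = 0.
grid = np.stack(np.meshgrid(*[np.linspace(0, 1, 5)] * 3), axis=-1).reshape(-1, 3)
mean_d, cv = spacing_stats(grid)
print(round(cv, 6))
```

You could run something like this on landmark coordinates exported from Slicer to compare point distributions before and after smoothing or remeshing a model.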
Thank you very much for your kindness Professor Murat Maga