Display normal while placing fiducial?

Is it possible (e.g., using VTK) to display the current surface normal while placing a fiducial? I was chatting with an anthropologist who mentioned older applications used for landmarking 3D specimens. Some of them are able to interactively display the current surface normal while moving a landmark (fiducial) point on a surface. Sounds like this is considered a pretty useful check for the proper positioning of landmarks in sutures, etc.

Thanks!

-Hollister

You can easily implement this (probably 20-30 lines of code): add an observer to the markups node; whenever it changes, get the point coordinates, use them as the input to a probe filter that samples the surface, and feed the result into a glyph filter. The glyph filter output can be displayed as the polydata of a model node.

If this turns out to be a commonly needed feature then it can be nicely implemented as a custom markups measurement plugin.

I will take a look. I will most likely (almost certainly) have questions… Dumb first question - this is in the C++ portion of the codebase, correct?

Thanks!!

-Hollister

The Markups module is implemented in C++, but you can implement everything I described in Python, just by connecting a few VTK filters.
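For anyone who wants to try this, here is a minimal sketch of the pipeline described above, run from the Slicer Python console. It assumes a model node named "Model" and a point list node named "F" (both names and the glyph scale factor are placeholders - adjust for your scene), and that neither node is under a transform; the probe may also miss points that do not land exactly on a surface cell:

```python
import vtk
import slicer

modelNode = slicer.util.getNode("Model")    # surface to sample (assumed node name)
markupsNode = slicer.util.getNode("F")      # fiducial / point list (assumed node name)

# Make sure the surface has point normals that the probe filter can sample
normalsFilter = vtk.vtkPolyDataNormals()
normalsFilter.SetInputData(modelNode.GetPolyData())
normalsFilter.Update()
surfaceWithNormals = normalsFilter.GetOutput()

# Model node that will display the arrow glyphs
glyphModel = slicer.mrmlScene.AddNewNodeByClass("vtkMRMLModelNode", "NormalGlyphs")
glyphModel.CreateDefaultDisplayNodes()

def updateGlyphs(caller=None, event=None):
    # Collect the current control point positions into a small polydata
    points = vtk.vtkPoints()
    for pos in slicer.util.arrayFromMarkupsControlPoints(markupsNode):
        points.InsertNextPoint(*pos)
    pointsPolyData = vtk.vtkPolyData()
    pointsPolyData.SetPoints(points)

    # Sample the surface normals at the control point locations
    probe = vtk.vtkProbeFilter()
    probe.SetInputData(pointsPolyData)
    probe.SetSourceData(surfaceWithNormals)
    probe.Update()

    # Draw an arrow at each point, oriented along the sampled normal
    arrow = vtk.vtkArrowSource()
    glyph = vtk.vtkGlyph3D()
    glyph.SetInputData(probe.GetOutput())
    glyph.SetSourceConnection(arrow.GetOutputPort())
    glyph.SetVectorModeToUseNormal()
    glyph.SetScaleFactor(10.0)  # arrow length in scene units (placeholder value)
    glyph.Update()

    glyphModel.SetAndObservePolyData(glyph.GetOutput())

# Recompute the glyphs whenever a control point is modified
markupsNode.AddObserver(slicer.vtkMRMLMarkupsNode.PointModifiedEvent, updateGlyphs)
updateGlyphs()
```

Rebuilding the whole glyph set on every point event is not efficient for large point lists, but it keeps the sketch short; a real implementation would update only the modified point.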


Gotcha. Thanks, Andras!

Hope you are well.

-Hollister

Not entirely sure about this. While the old IDAV Landmark Editor did indeed display the normal at the selected mesh vertex, this information is not used anywhere in the analysis. It was mostly a way to remedy the poor rendering and performance of that software (sometimes points would fall inside the mesh due to discontinuities, and with a large arrow glyph it is easier to drag and move the point around than an invisible sphere).

Hi Murat! I hope you are well.

Thanks a lot for this feedback! Yeah, the person in question is a long-time user of Landmark Editor. I brought up SlicerMorph during a one-week class on geometric morphometrics at the AMNH and showed them a bit of SlicerMorph and your overview YouTube video - he was very interested in converting to SlicerMorph.

I’ll pass this info along to him - do you mind if I bring him in on this conversation?

Thanks!!

-Hollister

Sure, but if these are specific to SlicerMorph, it is perhaps better to start a new thread with that tag (if you are also seeking community input).

Alternatively, if they are more of a feature request specific to SlicerMorph, submit them as issues on our GitHub repo (https://github.com/SlicerMorph/SlicerMorph).

It is better to keep the discussion on the forum. Markups are very actively developed and we are still making many design decisions in these weeks, so it is good if we hear about user needs (to ensure we choose design options that make such features easier to implement in the future).

@sunderlandkyl is working on the new ROI (box, sphere, curved box) widget. For this, we will improve the interaction handles, which currently only support translation and rotation around the center point; additional handles will be added (e.g., to edit the ROI/slice intersection).

@RafaelPalomar is working on making the widgets customizable/extensible in extensions. It should make it easy to add a custom widget representation class in Python. Displaying an arrow at each markup point could be a custom widget representation.

Since we already have orientation as part of markups, displaying markups with an oriented widget should not be too hard to implement in Slicer core either.

And there are also markups measurements, which we are improving and which could be used to display additional pieces of information in the views.

So, just keep the discussion visible here on the forum and describe what you need - we are listening.


Yep, totally agreed - he’s keen on joining in on the discourse. I was mostly wondering (given what Murat mentioned) whether this was more a SlicerMorph issue or a general Slicer markups issue. Sorry if that wasn’t clear - early morning post before I’d finished my coffee.

Thanks!

-Hollister

@lassoan @smrolfe This issue, i.e. the ability to see the normal vector where a control point is placed on a 3D model, came up during a SlicerMorph training; it looks like a couple of our users want to see this functionality. Can we do a prototype?

This feature is available now for planes. Would that be sufficient?

Do you know what the normals would be used for?


I think it would be sufficient. Would it continue to update the normal once a point is placed, if the position of the point is modified? (In the plane tool it doesn’t, which makes sense.)

I think they want to use it as a visual guide to understand how the curvature is changing and to help them position the curves more accurately on the models. I personally do not see the value, but it is a common request from people migrating to SlicerMorph from the old IDAV Landmark Editor software.

I believe the idea is to use the normal to see when a point sliding along the surface begins to drop into a suture between two bones. (Researcher in question is an anthropologist.)

In thinking about this more, maybe you’d actually be better off rotating your slice views to do this and placing landmarks in the slice view. This may be more a matter of adapting to Slicer’s workflow than of adding features? Just thinking out loud.

While that’s an option, it won’t work well with 3D models, only with volumes. Most people doing semilandmarking would be using a 3D model.

Oh, yeah, I see - so they would only be using a model, not a model and the volume dataset. I basically never use only models, so didn’t realize that.

@lassoan are you planning to expose this for point lists?

We store normals for point lists, and those normals could be used for orienting the point glyphs. It would not be hard to implement this (maybe 1-2 weeks of work for an experienced Slicer developer). However, I am not aware of any plans for adding this feature.