Projecting points to the positive external surface

Dear all,

I am trying to register and deform a surgical plate (roughly pre-registered using a few discrete landmarks) to adapt to the surface of the orbit. The plate has a thickness. You can see that some parts of the pre-registered, unbent plate are “sinking” into the orbit.

The approach I am testing to adapt the plate to the orbital surface is to 1) sample the same points at both the upper surface and undersurface of the plate, 2) project the points at the upper surface of the plate to the orbit, and 3) run a registration between the points at the under plate surface and the points projected to the orbit, followed by a thin-plate spline (TPS) transformation (deforming the plate at the same time). Though not perfect, it did adapt most of the plate to the surface of the orbit.

Points at the upper surface of the plate. Same points were also sampled at the under surface.

Points projected to the orbital surface from the points sampled at the upper surface of the plate.

Deformed plate adapted to the orbital surface, obtained by registering the points at the undersurface of the plate to the points projected to the orbital surface, followed by a thin-plate spline transformation.

However, a small part of the plate was still under the surface. This is because a few points (31, 32, 44) in that region were projected to the other side of the orbit due to the presence of a convex area at the medial orbital wall. Thus, a portion of the plate “sank” too deep under that convex area and got too close to the underside of the orbital surface.

Is there any way to project points onto a designated upper surface of the orbit?

The way I do the projection is by using the runPointProjection() function, based on projectPointsPolydata in the ALPACA module of the SlicerMorph extension, which casts a ray along the normal of each point and finds the intersection with the surface of another model.

The script I did was:

import ALPACA
logic = ALPACA.ALPACALogic()
plate_model_node = slicer.util.getNode("preregistered_plate")
lm_plate_upper_node = slicer.util.getNode("lm_plate_upper_surface")
orbit_model_node = slicer.util.getNode("orbit")

projectionFactor = 0.001  # I've been playing with this parameter.
lm_projected_to_orbit = logic.runPointProjection(plate_model_node, orbit_model_node, lm_plate_upper_node, projectionFactor)

Orbit, plate, and landmark files can be accessed at: - Google Drive

Thank you!



Can you just extend the segmentation at the back of the orbit so that only the front surface is close to the plate?

Thanks! This sounds like a good idea, at least it would make it much more convenient for data analysis. I’ll give it a try soon. I also need to check with the surgeon if he wants the orbital wall particularly well segmented.

What I did to “lift” the plate out of the orbit was to register the landmarks at the undersurface to those at the upper surface and then transform the undeformed plate accordingly. Iterating the process a few times would “lift” the plate up.

I was then able to project all the points from the plate to the upper orbital surface and run a TPS deformation to adapt the plate to the surface of the orbit (visually satisfying, at least).

For repeatability, I am thinking about using something similar to Model to Model Distance to calculate a scalar value at the plate (e.g., Python code model-to-model distance - #2 by smrolfe). If the scalar values all become negative, it means the plate has been “lifted” to a position just above the orbit. If this sounds reasonable, is there a way to calculate the signed distance between particular landmarks and a surface?


The example below should allow you to create a vertex-only polydata you could embed in a vtkMRMLModelNode with SetAndObservePolyData and then use Model-to-Model distance.

import vtk

def createVertexPolyData(pointlist):
    points = vtk.vtkPoints()
    cellArr = vtk.vtkCellArray()
    for p in pointlist:
        pointId = points.InsertNextPoint(p.x, p.y, p.z)
        vert = vtk.vtkVertex()
        vert.GetPointIds().SetId(0, pointId)
        cellArr.InsertNextCell(vert)
    polydata = vtk.vtkPolyData()
    polydata.SetPoints(points)
    polydata.SetVerts(cellArr)
    return polydata

But if you need spheres instead of vertices (because the Model to Model Distance module needs surface normals) you could use:

Hope it helps


This looks great. Thank you very much!


Several people reported that they could reliably extract a continuous inner surface of the orbital wall using the Wrap Solidify effect in Segment Editor (provided by the SurfaceWrapSolidify extension). Since it is a simple, single surface, you can snap your plate model directly to it, without any further processing, iterations, etc.

You need to set these inputs to the solidify effect:

You then get the complete, solid model of the orbital cavity in a couple of seconds. Even if the orbital wall is incomplete (which is very often the case, because it is a thin bone and the partial volume effect makes the bone density too similar to soft tissues), the extracted surface is complete and smooth:


Thanks a lot! I was able to generate an “endocast” of the orbital cavity by using the Paint brush to create a smaller segment within it and then using Wrap Solidify to fill the cavity.

This tool definitely has lots of potential. It turns out that during surgery, they’ll just remove all the bones surrounding the fracture site and fit the plate to the area. The only hard requirement appears to be aligning the periphery and the posterior stop (usually the palatine process deep in the orbit). The “endocast,” at least as far as I can tell, captured the periphery very well, so it should be very helpful as a reference for adapting the plate. I’ll keep you updated once I get some feedback.
