3D Ultrasound Reconstruction with openIGTL (Verasonics & Polhemus)

US imaging: Verasonics 64 LE
Position sensor: Polhemus EM tracking sensor

Here’s what I’ve done so far:

  1. Made Verasonics images available for import in Python.

  2. Imported the EM tracker’s readings into Python and built a transform matrix from them.

  3. Successfully sent a test image and transform matrix to 3D Slicer over OpenIGTLink.

  4. We can now visualize the image in real time at its tracked location (via volume rendering).
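For step 2, a minimal sketch of building the 4x4 transform from a Polhemus-style pose (position plus azimuth/elevation/roll Euler angles) might look like the following. The Z-Y-X rotation order is the convention Polhemus documents for its Euler angles, but verify it against your device manual; the function name is just illustrative.

```python
import math

def polhemus_to_matrix(x, y, z, azimuth, elevation, roll):
    """Build a 4x4 homogeneous transform from a Polhemus-style pose.

    Position is in the tracker's units; azimuth/elevation/roll are in
    degrees, applied in Z-Y-X order (check your device's convention).
    """
    az, el, ro = (math.radians(a) for a in (azimuth, elevation, roll))
    ca, sa = math.cos(az), math.sin(az)
    ce, se = math.cos(el), math.sin(el)
    cr, sr = math.cos(ro), math.sin(ro)
    # R = Rz(azimuth) @ Ry(elevation) @ Rx(roll)
    r = [
        [ca * ce, ca * se * sr - sa * cr, ca * se * cr + sa * sr],
        [sa * ce, sa * se * sr + ca * cr, sa * se * cr - ca * sr],
        [-se,     ce * sr,                ce * cr],
    ]
    # Embed rotation and translation in a 4x4 homogeneous matrix
    return [
        [r[0][0], r[0][1], r[0][2], float(x)],
        [r[1][0], r[1][1], r[1][2], float(y)],
        [r[2][0], r[2][1], r[2][2], float(z)],
        [0.0, 0.0, 0.0, 1.0],
    ]
```

The resulting matrix can then be wrapped in an OpenIGTLink TRANSFORM message (e.g. with pyigtl) and sent to Slicer.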

Here are the results so far

The result above is a random test image I created for a simple experiment, floating at an arbitrary position in space.

By keeping the image the same and varying the transform matrix at 1s intervals, I was able to see the slices move in 3D space.
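That update loop can be sketched as below. The matrix generator is pure Python; the sending part is commented out because it needs pyigtl and a running Slicer OpenIGTLink connector, and the device name "ImageToReference" is only an example.

```python
import time

def slice_transform(step, dz=1.0):
    """4x4 transform that translates the slice `step * dz` mm along z."""
    return [
        [1.0, 0.0, 0.0, 0.0],
        [0.0, 1.0, 0.0, 0.0],
        [0.0, 0.0, 1.0, step * dz],
        [0.0, 0.0, 0.0, 1.0],
    ]

# Sketch of the 1-second update loop (not executed here):
# import pyigtl
# client = pyigtl.OpenIGTLinkClient(host="127.0.0.1", port=18944)
# for step in range(60):
#     msg = pyigtl.TransformMessage(slice_transform(step),
#                                   device_name="ImageToReference")
#     client.send_message(msg)
#     time.sleep(1.0)  # vary the transform at 1 s intervals
```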

However, what I want is for the updated slices to keep stacking.

Currently, when a second slice arrives, the first one disappears; the volume is simply replaced rather than accumulated.

I want the slices to keep accumulating in the 3D scene.

Plus is difficult to use here because, as far as I can tell, it does not support Verasonics devices.

I Googled many combinations of search terms but never found a clear answer.

However, I’ve seen a few other people struggle with loading data from Verasonics and Polhemus devices.

If this is you, I can help.
Contact me at gu_hong3648@naver.com and I’ll help you out.

I would recommend using the “Show slice in 3D views” button (the eye icon) to display the image slice in 3D instead of using volume rendering for a single slice. Volume rendering is much more computationally expensive, and it is harder to make a single slice appear opaque.

You can use the Volume Reconstruction module of the SlicerIGT extension to “stack” the images into a 3D volume (it is much more complex than simple stacking, as the images may be oriented arbitrarily). See the step-by-step instructions in SlicerIGT Tutorial U-34.

Note that the diagonal line (and the gray area at the bottom of the image) indicates that you have not set the correct image width in the OpenIGTLink image header.
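To see why a wrong header width produces a diagonal artifact, here is a small illustrative demo (not Slicer's actual decoder): a flat pixel buffer resliced with an off-by-one width makes a vertical line drift diagonally, and leftover pixels at the end of the buffer would show up as a garbage strip.

```python
def reslice(pixels, width):
    """Interpret a flat pixel buffer as rows of `width` pixels,
    dropping any incomplete trailing row (as a naive receiver might)."""
    return [pixels[i:i + width] for i in range(0, len(pixels) - width + 1, width)]

# A 4x4 image whose first column is bright (a vertical line):
flat = []
for row in range(4):
    flat.extend([255, 0, 0, 0])

correct = reslice(flat, 4)  # vertical line stays in column 0
wrong = reslice(flat, 5)    # declared width off by one: the line drifts
                            # one column per row, i.e. a diagonal line
```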

I got it!
I love you


Great! You may consider writing a “Success story” topic about this (just a few sentences and the image that you included above).

Hi, can I use Polhemus EM trackers in SlicerIGT?

As far as I know, the situation is still the same as described here:

Thanks for the reply. But I wonder, since we already have a Polhemus tracker, is it possible to implement an OpenIGTLink interface for Polhemus ourselves? Is there a guide for doing this?

A quick-and-dirty bridge can be put together in Python that converts Polhemus VRPN messages (received with a VRPN Python package) to OpenIGTLink messages (sent with pyigtl).
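A sketch of such a bridge, under stated assumptions: VRPN reports poses as position plus a unit quaternion in (qx, qy, qz, qw) order, and the tracker address "Polhemus@localhost" and device name "SensorToTracker" are placeholders. The pose-to-matrix conversion is pure Python; the I/O skeleton is commented out because it needs the `vrpn` and `pyigtl` packages.

```python
def vrpn_pose_to_matrix(position, quaternion):
    """Convert a VRPN pose (position (x, y, z), unit quaternion
    (qx, qy, qz, qw)) to a 4x4 matrix suitable for OpenIGTLink."""
    x, y, z = position
    qx, qy, qz, qw = quaternion
    # Standard unit-quaternion-to-rotation-matrix formula
    return [
        [1 - 2*(qy*qy + qz*qz), 2*(qx*qy - qz*qw),     2*(qx*qz + qy*qw),     x],
        [2*(qx*qy + qz*qw),     1 - 2*(qx*qx + qz*qz), 2*(qy*qz - qx*qw),     y],
        [2*(qx*qz - qy*qw),     2*(qy*qz + qx*qw),     1 - 2*(qx*qx + qy*qy), z],
        [0.0, 0.0, 0.0, 1.0],
    ]

# Bridge skeleton (requires `vrpn` and `pyigtl`; names are examples):
# import vrpn, pyigtl
# server = pyigtl.OpenIGTLinkServer(port=18944)
# def on_pose(userdata, data):
#     matrix = vrpn_pose_to_matrix(data["position"], data["quaternion"])
#     server.send_message(
#         pyigtl.TransformMessage(matrix, device_name="SensorToTracker"))
# tracker = vrpn.receiver.Tracker("Polhemus@localhost")
# tracker.register_change_handler(None, on_pose, "position")
# while True:
#     tracker.mainloop()
```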

Thanks. Is there any documentation on the sequence metafile format, or an API for programmatically constructing one? We want to write ultrasound images and their corresponding tracker data into a single sequence metafile so that we can use the VolumeReconstructor command in Plus to reconstruct a 3D volume.
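For reference, a Plus sequence metafile is a MetaIO (.mha/.mhd) file with per-frame custom fields. A hedged sketch of the header is below; the transform name (ProbeToTracker here), the row-major 16-value transform layout, and the exact field set should be verified against the Plus documentation or a file saved by Plus itself, and the raw frame pixels follow the header when `ElementDataFile = LOCAL`.

```
ObjectType = Image
NDims = 3
DimSize = 640 480 2
ElementType = MET_UCHAR
ElementSpacing = 1 1 1
UltrasoundImageOrientation = MF
Seq_Frame0000_FrameNumber = 0
Seq_Frame0000_Timestamp = 0.000
Seq_Frame0000_ProbeToTrackerTransform = 1 0 0 0 0 1 0 0 0 0 1 0 0 0 0 1
Seq_Frame0000_ProbeToTrackerTransformStatus = OK
Seq_Frame0000_ImageStatus = OK
Seq_Frame0001_FrameNumber = 1
Seq_Frame0001_Timestamp = 0.033
Seq_Frame0001_ProbeToTrackerTransform = 1 0 0 10 0 1 0 0 0 0 1 0 0 0 0 1
Seq_Frame0001_ProbeToTrackerTransformStatus = OK
Seq_Frame0001_ImageStatus = OK
ElementDataFile = LOCAL
```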