Different coordinate systems when exporting STL files from 3D Slicer to Matlab

Hi,

I am using 3D Slicer 4.8.1 to create models of the muscles of the lower leg, and I have just stumbled upon the following problem.

I export the models from Slicer as .stl files and then read and display them in Matlab using the function stlread. Additionally, the images I use for creating the models contain markers. I manually read out the positions of those markers using the Data Probe in 3D Slicer. My goal is to create a plane in Matlab that passes through the marker positions and cuts through my volume. The plane is created at the expected position in Matlab, but the STL volume appears in a completely different position.
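Here is roughly what I do in Matlab, just as a minimal sketch: I use the File Exchange version of stlread that returns a struct with faces and vertices, and the file name and marker coordinates below are only placeholders.

```matlab
% Load the exported model (File Exchange stlread returning a faces/vertices struct).
fv = stlread('soleus.stl');                 % placeholder file name

% Marker positions read manually from the Data Probe in 3D Slicer (placeholder values).
p1 = [ 10.2,  35.1, -120.4];
p2 = [ 25.7,  40.8, -118.9];
p3 = [ 18.3,  55.6, -122.0];

% Display the muscle model.
patch('Faces', fv.faces, 'Vertices', fv.vertices, ...
      'FaceColor', [0.8 0.7 0.7], 'EdgeColor', 'none');
hold on; axis equal; camlight; lighting gouraud;

% Plane through the three markers (normal via cross product).
n = cross(p2 - p1, p3 - p1);
n = n / norm(n);

% Draw a patch of that plane around the markers for visualization.
c = (p1 + p2 + p3) / 3;                     % centroid of the markers
u = (p2 - p1) / norm(p2 - p1);              % in-plane axes
v = cross(n, u);
s = 100;                                    % half-size of the displayed plane [mm]
corners = repmat(c, 4, 1) + s * [u + v; u - v; -u - v; -u + v];
fill3(corners(:,1), corners(:,2), corners(:,3), 'b', 'FaceAlpha', 0.3);

% Plot the marker points themselves.
plot3([p1(1) p2(1) p3(1)], [p1(2) p2(2) p3(2)], [p1(3) p2(3) p3(3)], 'ro');
```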

I’ve read that other people have had problems with the coordinate system when exporting STL models, but I couldn’t find out how to solve this when importing the volume into Matlab. Is there a way to maintain the Slicer coordinate system when exporting the models? Or is there a transformation that can be applied after export to bring the model back into its Slicer coordinates?

Thank you very much in advance!

Kind regards,
Clara

Hi Clara -

This page describes the way Slicer manages coordinate systems:

https://www.slicer.org/wiki/Coordinate_systems

With the most recent versions of Slicer (the current nightly) there’s also an export widget that lets you choose the space for the exported models:

Hope that helps,
Steve

Also note that you can run your Matlab functions directly from Slicer using the MatlabBridge extension:
https://www.slicer.org/wiki/Documentation/Nightly/Extensions/MatlabBridge

Could you describe the end goal of your project? You may be able to do most of the processing in Slicer and do only your very specific processing in Matlab (or even port that to Python to completely remove the dependency on proprietary software).

Thank you for all the suggestions!

The general aim of my project is to determine muscle parameters such as fascicle length and pennation angles from DTI data of the lower leg. For this I use DSI Studio for fascicle tracking and overlay the resulting tracts with the segmented muscle models in Matlab. A custom-written Matlab program then determines, among other things, the correct end points of the fascicles based on their intersection with the anatomical muscle boundaries.

The problem I have described here is part of the project where I am trying to compare muscle parameters from US data with parameters from DTI tracking, to see if DTI tracking yields reasonable results. For this, markers that are visible on the MRI images were applied to the leg. The same marker points were tracked during the US image acquisition using a Vicon system. The goal is to find the intersection of the 2D ultrasound plane with the 3D muscle model. This information is then used to track fascicles at the location of the intersecting plane within the muscle, so that the US parameters can be compared to the DTI parameters.

Before using 3D Slicer my research group used the software Mimics for segmentation, which is why all the Matlab code for the transformations between the US and MRI coordinate systems, as well as for the muscle parameter measurements, already exists. I only now realised that the STL models created by 3D Slicer and displayed in Matlab are in a different coordinate system than the data within 3D Slicer. The marker positions obtained with the Data Probe in 3D Slicer therefore can’t be used for the US-to-MRI coordinate transformation if the model is not in the same coordinate system.

Additionally, I have noticed that my data gets mirrored along the y-z plane when I import it into 3D Slicer (the left leg appears to be the right and vice versa), which might also be part of the problem. This also happens when displaying it in ITK-SNAP, but not when using MRIcro. It may not be directly connected to my question, but I thought it was worth mentioning.

Thanks again! Please, let me know if anything is unclear.

Best, Clara

I’ve just noticed that I wrote an answer a month ago but never posted it… Here it is:

Thank you very much for writing about your project, it sounds very exciting.

You might be interested in the SlicerIGT extension, which can connect to a wide range of tracking and ultrasound imaging systems and display positions and images in real time in Slicer, record and replay synchronized data streams, reconstruct 3D volumes from tracked 2D ultrasound, etc. It can also do ultrasound spatial calibration, automatic temporal calibration (of image and tracking data streams), and various other calibrations. We don’t support Vicon tracking systems, but we do support the order-of-magnitude less expensive OptiTrack (for example, an OptiTrack Duo costs $2300), as well as a number of medical-grade optical and electromagnetic trackers (NDI, Ascension, Claron, etc.) and other sensors - see complete list here.

I think the only difference should be the LPS/RAS coordinate system switch, which you have already noticed: the sign of the x and y coordinates is flipped, which corresponds to a 180-degree rotation (or, equivalently, two mirrorings). If you notice anything else, let us know.
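In case you want to keep your current Matlab pipeline, here is a minimal sketch of applying that switch after loading the STL (assuming the File Exchange stlread that returns a struct with a vertices field; the file name is a placeholder):

```matlab
% Flip the sign of the x and y coordinates to switch between LPS and RAS.
% The flip is its own inverse, so the same operation converts in either direction.
fv = stlread('muscle_model.stl');       % placeholder file name
flipXY = diag([-1, -1, 1]);
fv.vertices = fv.vertices * flipXY;     % N x 3 vertex list, one point per row
```

After this sign flip the model vertices should line up with the coordinates you read in Slicer, and your existing US-to-MRI transformation code can stay unchanged.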

I would not recommend using the Data Probe for this purpose. It is difficult to get an accurate 3D position just by hovering the mouse over a point and then typing in the coordinate values. The Markups module is developed for exactly this purpose: marking landmark points in images and models. You can zoom in, adjust the position of the markers in orthogonal planes, and save the positions to a CSV file (a small example of reading such a file in Matlab is sketched below). You can also send markup point positions directly to Matlab using MatlabBridge.

Note that you can register the CT to the ultrasound coordinate system using the Fiducial Registration Wizard module of the SlicerIGT extension. It lets you see the alignment in real time, supports rigid, affine, and warping registration, can deal with missing and arbitrarily ordered landmarks, and supports recording data points directly from the tip of a tracked pointing tool.
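If you do save the markup points to file and want to read them into Matlab, here is a minimal sketch. It assumes the .fcsv format written by recent Slicer versions (comment lines starting with '#', followed by comma-separated id,x,y,z,... columns); the exact column layout can vary between Slicer versions, and the '# CoordinateSystem' header line tells you whether the coordinates are stored as RAS or LPS, so check it before combining the points with your STL vertices.

```matlab
% Read fiducial positions from a Slicer Markups .fcsv file (placeholder name).
fid = fopen('markers.fcsv', 'r');
C = textscan(fid, '%s %f %f %f %f %f %f %f %d %d %d %s %s %s', ...
             'Delimiter', ',', 'CommentStyle', '#');
fclose(fid);

markerPositions = [C{2}, C{3}, C{4}];   % N x 3 matrix of x, y, z coordinates
markerLabels    = C{12};                % fiducial labels
```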