Export DICOM files colored from preset settings

Dear Slicer team,

Best wishes for 2019!

The presets that Slicer offers automatically add colors to the loaded DICOM files. I would like to export these colored volumes and visualize them in Virtual Reality. I have a tool for this that stacks the DICOM files and visualizes the images in VR.

Currently, the DICOM files are still black, grey, and white when I export them with a selected preset. Is it possible to export the DICOM files in color?

I also tried to color the scans through the editor mode and then export the DICOM files, but unfortunately there I also get an uncolored export.

Can someone please help me get colored DICOM output?

On another note, if anybody has any tips or tricks they want to share concerning virtual reality, please let me know!

I look forward to receiving a response.

Kind regards,
Chris

You can show images (even 4D volumes, replayed in real time) using volume rendering, and you can interact with the scene, move objects, etc. directly in Slicer using the SlicerVirtualReality extension. See more information here: https://github.com/KitwareMedical/SlicerVirtualReality
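
If it helps, here is a minimal sketch (to run in the Slicer Python console, with the SlicerVirtualReality extension installed) of starting the virtual reality view programmatically; please double-check the method names against the version of the extension you have installed:

```python
# Connect to the OpenVR runtime and start rendering into the headset.
# Requires the SlicerVirtualReality extension; run in the Slicer Python console.
vrLogic = slicer.modules.virtualreality.logic()
vrLogic.SetVirtualRealityConnected(True)  # connect to the headset/runtime
vrLogic.SetVirtualRealityActive(True)     # mirror the 3D view into VR
```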

More features are added continuously, but if you need any specific functionality then let us know. Also, any contribution would be very welcome.

If you just want to show segmented models in virtual reality (and you can segment everything you need, with the extra effort required for segmentation being acceptable), then you could use any software (Unity, Unreal, etc.) or implement native applications from scratch. You can even add volume rendering. However, if you are considering implementing virtual reality applications for medical imaging, then Slicer is by far the most powerful and versatile platform for that.


Dear Andras,

Many thanks for your quick response. Great to hear that Slicer offers these extensive VR applications. Will definitely dive into those options.

Would it be possible to have coloured exports based on the presets? It would really be of added value to me, and I am willing to contribute financially.

Thanks in advance,

Chris

This is discussed extensively here: Save volume rendering as STL file - #14 by maxabernathy. In short, the volume is saved as is (with single-component scalar values) and you color it using transfer functions (a scalar opacity transfer function, a scalar color transfer function, and sometimes also a gradient opacity transfer function). The transfer functions are saved in a .vp file when you save the scene (a simple text file that lists the number of points in each transfer function, followed by the point coordinates). You can find information about these transfer functions and how they are used for volume rendering in the VTK textbook.
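
To make this concrete, here is a minimal sketch of how such transfer functions map raw scalar values to color and opacity, using plain VTK from Python. The control points below are made up for illustration; the real ones come from the selected preset (or from the saved .vp file):

```python
import vtk

# Scalar color transfer function: scalar value -> (R, G, B)
colorTF = vtk.vtkColorTransferFunction()
colorTF.AddRGBPoint(-1000.0, 0.0, 0.0, 0.0)  # air -> black (example values)
colorTF.AddRGBPoint(150.0, 0.8, 0.4, 0.3)    # soft tissue -> reddish
colorTF.AddRGBPoint(1000.0, 1.0, 1.0, 0.9)   # bone -> near white

# Scalar opacity transfer function: scalar value -> opacity in [0, 1]
opacityTF = vtk.vtkPiecewiseFunction()
opacityTF.AddPoint(-1000.0, 0.0)             # air is fully transparent
opacityTF.AddPoint(150.0, 0.3)
opacityTF.AddPoint(1000.0, 0.9)

# The renderer evaluates these functions for every sample along each ray:
value = 300.0
r, g, b = colorTF.GetColor(value)
a = opacityTF.GetValue(value)
print("scalar %g -> RGBA (%.2f, %.2f, %.2f, %.2f)" % (value, r, g, b, a))
```

This is also why a plain DICOM or STL export stays grey: the color only exists in these functions, not in the voxel data itself.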

Hello, and happy 2019!!
I am also interested in viewing volume rendering in a mixed reality HMD; specifically, I am trying to get the Magic Leap AR headset connected to Slicer. Any general advice about the path I should take?

many thanks
Amir

Short answer:

  • If possible, use virtual reality instead of augmented reality: it is already well supported within Slicer and much more mature in general.
  • Augmented reality is not ready for real-world use yet, but you can implement quick prototypes using Unity for early feasibility tests. Slicer can be used to create surface models that these prototypes can use. We plan to have OpenIGTLink-based real-time transfer of meshes from Slicer to Unity-based applications (not ready yet, contributions are welcome); a Slicer-side sketch is shown after this list.
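
The Slicer side of such a transfer can already be prototyped with the OpenIGTLinkIF (SlicerOpenIGTLink) extension. A minimal sketch, assuming a hypothetical model node named "MySegmentedModel" (the Unity-side receiver is not covered here):

```python
# Serve a segmented surface model over OpenIGTLink from Slicer.
# Requires the OpenIGTLinkIF extension; run in the Slicer Python console.
connector = slicer.mrmlScene.AddNewNodeByClass("vtkMRMLIGTLConnectorNode")
connector.SetTypeServer(18944)  # listen on the default OpenIGTLink port
connector.Start()

modelNode = slicer.util.getNode("MySegmentedModel")  # hypothetical node name
connector.RegisterOutgoingMRMLNode(modelNode)  # sent as a POLYDATA message
connector.PushNode(modelNode)  # push the current mesh to connected clients
```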

Long answer:

The first question is why you would want to connect a HoloLens (or a Magic Leap, which can do essentially the same) to Slicer. What application for augmented reality do you have in mind?

We’ve been evaluating HoloLens for various clinical applications for 2 years (burr hole location planning - currently tested in a study in the OR on patient cases, surgical skill training, anatomical training for needle insertion, etc.) and we find that while the technology is very promising, current headsets still have very significant limitations. The most promising use of augmented reality would be in situ visualization within arm’s-length distance, but unfortunately none of the current headsets (HoloLens, Magic Leap, Meta, etc.) can do that, due to the focal plane being placed at around 100-300 cm instead of 40-70 cm: you cannot see virtual objects and real objects in focus at the same time (you lose the sense of depth, so you cannot align shapes in 6 DOF). There are secondary issues, such as instability of virtual objects (you often get errors of 3-5 mm when you move around an object), the size and weight of the headset, and the lack of computational power on untethered devices.

HTC Vive Pro has video pass-through capabilities, which allows using all the virtual reality infrastructure for augmented reality (SlicerVirtualReality could be used for this). However, image quality, lag, fixed focal distance, and dynamic range under strong focused OR lighting might be problematic with video pass-through augmented reality.

Can you use virtual reality instead?

If you don’t need in-situ visualization at arm’s-length distance then you may just as well use virtual reality. Virtual reality headsets have none of the limitations listed above: they are ready to use for several end-user applications, they are inexpensive, and a single software interface (OpenVR) can be used for all major headsets (HTC Vive, Windows MR, Oculus Rift).

What visualization would you need?

If you only need rendering of surface meshes then you can use simple Unity applications to render them and implement simple interactions. Since headsets are not yet ready for real-world use anyway, these quick throw-away prototypes are appropriate. We’ve been working on implementing an OpenIGTLink interface for sending segmented models from Slicer to Unity-based applications (so that you don’t need to build and deploy a new application for each patient case), but it is not ready yet.

Volume rendering is feasible, too; you can buy a volume renderer from the Unity Asset Store for a few tens of dollars. The computational capabilities of untethered headsets are limited and these volume renderers are not as sophisticated as VTK’s volume renderer, but they might be OK for some applications.

Many thanks for the informative answer.
We are using Magic Leap to build a prototype for needle insertion.

We want a real-time 3D ultrasound volume projected via the Magic Leap onto the insertion area.

Are you sure we will have a focal distance limit with the Magic Leap?

thanks!

Magic Leap has focal planes at 1 m and 3 m, so you won’t be able to align a needle with a displayed trajectory in 3D while holding the needle in your hand at 50 cm distance. Of course, try it and see for yourself. You don’t need to develop any custom software: just show a static model of a tube and try to align a needle with it.

What we do now as a workaround is to let the user easily show/hide the virtual needle trajectory by pressing a button. The user looks at the hologram (focusing at a distant plane), then quickly hides the trajectory and looks at the physical object (focusing on the object). We’ve found that overall these virtual guides help, but we would need a headset with a focal plane at 50 cm for these virtual guides to work at least as well as laser guides (which you can see continuously while looking at the patient and needle).

Hopefully HoloLens 2 or some other headset in the near future will come with a much closer focal plane. There may be additional problems to solve for high-quality, stable hologram visualization at arm’s-length distance, since tracking errors may be more visible, the view may be more occluded, etc., so this may not be available very soon.

Dear Andras,

Many thanks for your response.

The tutorial on Segmentation for 3D printing requires making specific segmentations before you can download them. I would like to see if I can download the preset (in full colors), like below, to see if I can view it in Unity.

If I select the preset and then export, I logically get an empty .stl file. Can you please tell me how I can export the preset?

[screenshot: volume rendered with a colored preset]

How can I make a financial contribution to your team?

Kind regards,
Chris

The presets are described in the following file: Slicer/Modules/Loadable/VolumeRendering/Resources/presets.xml at main · Slicer/Slicer · GitHub
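
If you want to experiment, a preset can also be applied to a loaded volume from the Slicer Python console. A minimal sketch, where the volume name and preset name are just examples:

```python
# Apply a built-in volume rendering preset to a loaded volume.
volumeNode = slicer.util.getNode("MyCTVolume")  # hypothetical volume name
vrLogic = slicer.modules.volumerendering.logic()
displayNode = vrLogic.CreateDefaultVolumeRenderingNodes(volumeNode)
presetNode = vrLogic.GetPresetByName("CT-Chest-Contrast-Enhanced")
displayNode.GetVolumePropertyNode().Copy(presetNode)  # copy transfer functions
displayNode.SetVisibility(True)
```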

If you are looking to work with a team to implement specific functionality, you could have a look at the list of commercial partners. See CommercialUse - Slicer Wiki

Or are you looking to make a “donation” to help us maintain the infrastructure and support the project overall?