Hi,
I recently came across 3D Slicer and I was wondering whether this software could assist me with a research project. I would like to do real-time 2D ultrasound scanning and 3D volumetric reconstruction in virtual reality, using a Meta Quest 3 VR controller as the spatial tracker for the ultrasound probe. Is this currently possible using 3D Slicer and its extensions?
I should clarify that I am interested in doing this using the Quest 3's augmented reality (passthrough) mode and 3D printing a mount that attaches the Quest controller to the ultrasound probe. In short, the workflow would be:
1. Attach a Quest 3 controller to the ultrasound probe using a 3D-printed mount, so the controller can serve as the probe's tracker in 3D Slicer.
2. Put on the Quest 3 in passthrough (AR) mode.
3. Slowly live-scan a model with the ultrasound probe, displaying each slice in AR superimposed over the model, directly below the probe.
4. Once a sufficient 2D scan is collected, have 3D Slicer perform a 3D volumetric reconstruction of the images and display the reconstruction in AR superimposed over the model (below the probe), instead of the individual 2D ultrasound images.
Hope that makes sense. Thanks again for your help.
I’m not sure how practical it is to attach a controller, but it is possible if you want to avoid the need for an external tracker. Maybe Meta manufactures smaller hardware that supports this better. In any case, 1) there needs to be a reference coordinate system fixed to the “patient”, and 2) the US probe needs to be calibrated with respect to the tracked controller to obtain the ControllerToImage transform.
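To illustrate what those two requirements buy you, here is a minimal sketch using homogeneous 4x4 matrices. The transform names (ControllerToReference, ImageToController) and the numeric values are illustrative only; in Slicer these would live in linear transform nodes, and the real calibration also includes rotation and pixel scaling:

```python
import numpy as np

def translation(x, y, z):
    """Build a 4x4 homogeneous transform containing a pure translation."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

# ControllerToReference: the controller pose reported by tracking,
# expressed in the reference coordinate system fixed to the patient.
controller_to_reference = translation(100.0, 0.0, 50.0)

# ImageToController: the probe calibration result -- maps ultrasound
# image (pixel) coordinates into the controller's coordinate system.
image_to_controller = translation(0.0, -20.0, 5.0)

# Composing the chain places any image pixel in patient space.
image_to_reference = controller_to_reference @ image_to_controller

# Map the image origin (pixel 0,0) into the reference frame.
origin = image_to_reference @ np.array([0.0, 0.0, 0.0, 1.0])
```

Without the calibration term (ImageToController), you would only know where the controller is, not where the pixels of the ultrasound image are.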
The Kitware team would know where OpenXR support currently stands in Slicer. I haven’t tried this feature and am not following the effort at the commit level, so I cannot tell you whether this is already possible or not.
For tracking and volume reconstruction using the VR controller as the tracker, you don’t need any new development; it all works. Just enable the controller transforms in the Virtual Reality module in Slicer. For volume reconstruction (and for meaningful display of the ultrasound image) you need to determine the transform between the controller’s coordinate system and the image. You can do this visually or by using ultrasound probe calibration tools (see the SlicerIGT tutorials).
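As a rough illustration of what tracked volume reconstruction does: each 2D frame's pixels are mapped through its ImageToReference pose into a voxel grid. This toy version only does nearest-voxel "pasting" (the real reconstruction in SlicerIGT/PLUS also handles interpolation, compounding, and hole filling); all names and values here are made up for the example:

```python
import numpy as np

def reconstruct(frames, poses, volume_shape, spacing=1.0):
    """Paste tracked 2D ultrasound frames into a 3D voxel grid.

    frames: list of 2D arrays of pixel intensities
    poses:  list of 4x4 ImageToReference transforms, one per frame
    """
    volume = np.zeros(volume_shape)
    for frame, pose in zip(frames, poses):
        rows, cols = frame.shape
        for r in range(rows):
            for c in range(cols):
                # Image-plane pixel (c, r, 0) in homogeneous coordinates.
                p = pose @ np.array([c, r, 0.0, 1.0])
                i, j, k = np.round(p[:3] / spacing).astype(int)
                if (0 <= i < volume_shape[0] and 0 <= j < volume_shape[1]
                        and 0 <= k < volume_shape[2]):
                    volume[i, j, k] = frame[r, c]  # nearest-voxel pasting
    return volume

# Two parallel frames, 5 units apart along z, sweep out a small volume.
frame = np.full((4, 4), 100.0)
pose0 = np.eye(4)
pose1 = np.eye(4)
pose1[2, 3] = 5.0
vol = reconstruct([frame, frame], [pose0, pose1], (10, 10, 10))
```

This is exactly why the calibration matters: an error in ImageToController shifts every pasted pixel in the reconstructed volume.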
That said, the VR controllers are very bulky and not very accurate. The good visual alignment between the controller and its AR rendering is misleading: the actual absolute tracking accuracy is quite poor compared to even an inexpensive optical tracker such as the OptiTrack Duo.
For augmented reality display using the Meta Quest 3, the Virtual Reality module in Slicer would need to be improved: the module needs to request the camera stream from the headset and then display those images in the view’s background. Since Microsoft discontinued the HoloLens 2, it is quite likely that new projects will use the Meta Quest 3 for augmented reality (due to its quality, availability, and price), and eventually somebody will implement this, but I don’t know of anyone working on it right now.
We don’t have anyone working on VTK rendering in passthrough mode at Kitware Europe currently, but we have recently submitted a few project proposals to open calls to do so (including integration into SlicerVR). I’ll make sure to let you know if things move forward.
When using the Quest 3, I think you must use Air Link rather than the Quest Link cable; otherwise it will request OpenGL ES as the rendering backend, which is not supported by SlicerVR as far as I know.