Is it possible to create a model of a body with all the underlying organs, etc., and then import the model into the Unreal game engine for further interaction between the user and the simulation? And how would the import be done? This is meant to be done in virtual reality, so ideally the user will wear the headset, see a body, and be able to interact with it, with the interactions and consequences programmed in Unreal. I know there's an extension, SlicerVR, but I don't think it's sophisticated enough to handle much interaction between the model and the user.
Hi -
You can export models in various formats for use with other systems such as Unity. There has been some recent work with glTF, but other formats should work too.
A lot depends on what you are trying to accomplish. SlicerVR allows you to interact directly with Slicer functionality, so things like segmentation, transforms, and volume rendering are all native. Plus, Slicer is fully open source, so it's possible to program anything you need. But of course it's newer and has a smaller developer base, so it's not as polished as something like Unity.
In general, SlicerVirtualReality makes sense if you are implementing a medical image viewing/analysis application, while a game engine is more suitable if you are implementing a game-like application.
See these slides from this recent SlicerVirtualReality presentation:
Yes, it’s definitely possible to create a detailed anatomical model and import it into Unreal Engine for use in a VR simulation with user interaction. Here’s a general overview of how this can be done:
- Model Creation
You would typically start by creating or acquiring a high-quality 3D anatomical model. This can be done in software such as Blender, Maya, or 3ds Max, or sourced from medical visualization tools like 3D Slicer or ZygoteBody. Models can include bones, organs, muscles, etc., layered appropriately.
- Model Preparation
Before importing into Unreal, you’ll need to optimize the model:
- Reduce polygon count where possible to keep performance smooth in VR.
- Create clean UV maps and textures (or use PBR materials).
- Separate parts that need to be interacted with individually (e.g., heart, liver) into their own mesh objects.
- Rig the model (especially if organs or systems will move or respond to input).
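As a rough sketch of the "separate parts" step above, the snippet below splits a multi-object Wavefront OBJ file into one standalone mesh per named object (e.g. one per organ), re-indexing the vertices so each part imports cleanly on its own. It handles only plain `v` and `f` records, and the function name is my own invention; a real pipeline would normally do this in Blender or a dedicated mesh library.

```python
def split_obj_by_object(obj_text):
    """Split a Wavefront OBJ string into {object_name: obj_string},
    re-indexing vertices so each part is a standalone mesh.
    Handles only 'v' and 'f' records with plain vertex indices."""
    vertices = []   # global vertex list (OBJ indices are global and 1-based)
    parts = {}      # object name -> list of faces (tuples of global indices)
    current = "default"
    for line in obj_text.splitlines():
        tokens = line.split()
        if not tokens:
            continue
        if tokens[0] == "v":
            vertices.append(tuple(float(x) for x in tokens[1:4]))
        elif tokens[0] == "o":
            current = tokens[1]
        elif tokens[0] == "f":
            # keep only the vertex index (the part before any '/')
            face = tuple(int(t.split("/")[0]) for t in tokens[1:])
            parts.setdefault(current, []).append(face)

    result = {}
    for name, faces in parts.items():
        remap = {}      # global vertex index -> local (per-part) index
        lines = []
        for face in faces:
            for idx in face:
                if idx not in remap:
                    remap[idx] = len(remap) + 1
                    x, y, z = vertices[idx - 1]
                    lines.append(f"v {x} {y} {z}")
        for face in faces:
            lines.append("f " + " ".join(str(remap[i]) for i in face))
        result[name] = f"o {name}\n" + "\n".join(lines) + "\n"
    return result
```

Each returned string can then be saved as its own `.obj` (or re-exported as FBX) so that, say, the heart and liver arrive in Unreal as individually selectable meshes.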
- Export to Unreal Engine
Export the model in a format compatible with Unreal, such as FBX or OBJ. Unreal Engine works best with FBX for skeletal meshes and animation.
- Import into Unreal Engine
Use Unreal’s Content Browser to import the FBX. Set up materials and physics if needed. You can then:
- Add collisions.
- Set up interaction blueprints (e.g., grabbing, rotating, dissecting).
- Use VR plugins like the OpenXR framework or SteamVR for input handling.
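On the collision point: Unreal's FBX importer recognizes a companion mesh whose name carries the `UCX_` prefix as a custom collision hull for the render mesh of the same name. As a small illustration of that naming convention (the box-collision approach and the helper name are my own), here is a Python sketch that emits an axis-aligned bounding box as a `UCX_`-prefixed mesh in OBJ notation, which you would merge into the exported geometry before conversion:

```python
def ucx_box_obj(name, vertices):
    """Emit an OBJ fragment containing the axis-aligned bounding box of
    'vertices', named 'UCX_<name>' so Unreal's importer can treat it as
    a simple collision hull for the render mesh '<name>'."""
    xs, ys, zs = zip(*vertices)
    lo = (min(xs), min(ys), min(zs))
    hi = (max(xs), max(ys), max(zs))
    # 8 corners of the box, in a fixed order (x slowest, z fastest)
    corners = [(x, y, z) for x in (lo[0], hi[0])
                         for y in (lo[1], hi[1])
                         for z in (lo[2], hi[2])]
    lines = [f"o UCX_{name}"]
    lines += [f"v {x} {y} {z}" for x, y, z in corners]
    # the 6 quad faces of the box (1-based indices into 'corners')
    faces = [(1, 2, 4, 3), (5, 7, 8, 6), (1, 5, 6, 2),
             (3, 4, 8, 7), (1, 3, 7, 5), (2, 6, 8, 4)]
    lines += ["f " + " ".join(map(str, f)) for f in faces]
    return "\n".join(lines) + "\n"
```

In practice you would let your DCC tool or Unreal's auto-generated convex hulls do this; the sketch only shows what the importer is looking for.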
- Interaction Logic
You’ll program interactions using Blueprints or C++, depending on complexity. For example:
- When a user touches an organ, highlight it or show a label.
- Simulate surgical procedures or disease progressions based on interactions.
- Provide real-time feedback based on user input (e.g., incorrect incisions, physiological responses).
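The touch-to-feedback logic in the bullets above would really live in Blueprints or Unreal C++, but its shape can be sketched engine-agnostically. The class names, organ labels, and event methods below are illustrative only, a toy stand-in for a Blueprint event graph:

```python
from dataclasses import dataclass, field

@dataclass
class Organ:
    name: str
    label: str
    highlighted: bool = False

@dataclass
class InteractionSystem:
    """Toy event dispatcher mirroring what a Blueprint graph would do:
    map (event, organ) pairs to feedback shown to the user."""
    organs: dict = field(default_factory=dict)
    log: list = field(default_factory=list)

    def add(self, organ):
        self.organs[organ.name] = organ

    def on_touch(self, name):
        # e.g. bound to a motion-controller overlap event in Unreal
        organ = self.organs[name]
        organ.highlighted = True
        self.log.append(f"highlight {organ.name}: {organ.label}")

    def on_release(self, name):
        organ = self.organs[name]
        organ.highlighted = False
        self.log.append(f"unhighlight {organ.name}")
```

In Unreal, `on_touch`/`on_release` would correspond to overlap or trace events on the controller, and the "log" would instead drive material highlights, UMG labels, or simulated physiological responses.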
- VR Setup
Set up the VR pawn, hand tracking, and interaction system. Unreal's VR Template is a good starting point, and plugins like the VR Expansion Plugin can help extend functionality.
- Alternatives to SlicerVR
You’re right that SlicerVR is more visualization-focused. For advanced interaction, Unreal Engine is far more flexible. If you’re working with DICOM data or want to stay in the medical ecosystem, use 3D Slicer to prep the model, then export it for Unreal as described above.
In short, yes—it’s feasible and quite common in medical simulation projects, though it requires a mix of 3D modeling, optimization, and Unreal development. Let me know if you want references or example projects.