We (and several other groups) used the HoloLens for burr hole placement (see paper, video). Since minimal user interaction was needed (patient registration, and after that just showing/hiding the skin surface, brain, and planned drill hole) and Unity already supported the HoloLens, we decided to use Slicer for creating the models and Unity for displaying them in AR. After initial feasibility was demonstrated on dozens of phantom studies and 15 patient cases, we put the project on hold: while the system worked for this simple, non-demanding clinical application, we were not confident that currently available technology could be effectively used for more difficult procedures (where higher accuracy and more complex user actions are needed and an AR system could have significant clinical utility).
Since we did not proceed further than initial feasibility, we did not complete our live Slicer/Unity bridge for sending models and transforms from Slicer to Unity. Still, you might find bits and pieces of the software that we developed useful: HololensQuickNav, OpenIGTLinkUnity
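For reference, here is a minimal sketch of what the Slicer side of such a bridge could look like, using the built-in OpenIGTLinkIF module to stream nodes to a connected client (e.g. Unity running an OpenIGTLink receiver such as the one in OpenIGTLinkUnity). The node names `SkinSurface` and `DrillHoleToReference` are made up for illustration; this is not our completed bridge, just a starting point you can run in the Slicer Python console:

```python
import slicer

# Create an OpenIGTLink server connector (requires the OpenIGTLinkIF extension/module)
connector = slicer.mrmlScene.AddNewNodeByClass("vtkMRMLIGTLConnectorNode")
connector.SetTypeServer(18944)  # default OpenIGTLink port
connector.Start()

# Hypothetical node names for illustration; replace with nodes in your scene
skinModel = slicer.util.getNode("SkinSurface")
drillTransform = slicer.util.getNode("DrillHoleToReference")

# Register nodes as outgoing so changes are streamed to connected clients
connector.RegisterOutgoingMRMLNode(skinModel)
connector.RegisterOutgoingMRMLNode(drillTransform)

# Push the current state immediately instead of waiting for a modification event
connector.PushNode(skinModel)
connector.PushNode(drillTransform)
```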
Thomas Muender and his team from Uni Bremen worked on a Slicer/Unity bridge, too; see their Project week page and repository.
@Amine_Ziane recently asked about using Unity with the zSpace device - see transfer scene files from 3DSlicer to Unity3D - #17 by Amine_Ziane. Maybe you can work together.