I think the setup is quite painless, especially if you just modify an existing scene that already has the controller transform sharing configured. The setup could also be automated with a short Python script.
The main idea is that you run two instances of Slicer, each connected to a headset. Both instances contain the same data nodes (models, volumes, etc.), and their parent transforms are set as outgoing nodes in the OpenIGTLinkIF module, which keeps the transform nodes synchronized between the two instances. The scene that we used in the video is available here.
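For reference, a minimal sketch of what such an automation script could look like in the Slicer Python console, assuming the SlicerOpenIGTLink extension is installed; the transform node name "ParentTransform" and the host/port are placeholders for your own scene and network setup:

```python
import slicer

# On the first instance: create a connector that listens as a server
# on the default OpenIGTLink port.
connector = slicer.mrmlScene.AddNewNodeByClass("vtkMRMLIGTLConnectorNode")
connector.SetTypeServer(18944)
connector.Start()

# Register the shared parent transform as an outgoing node so that it is
# streamed to the other instance whenever it changes.
transformNode = slicer.util.getNode("ParentTransform")  # placeholder name
connector.RegisterOutgoingMRMLNode(transformNode)
```

On the second instance the connector would be configured as a client instead, e.g. `connector.SetTypeClient("hostname-of-first-instance", 18944)`, and the corresponding transforms registered the same way in the opposite direction.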