Everything happens in the correct physical location, wherever the applied transforms placed the input or output nodes.
Yes, I see that you need an option to transform a node to a different coordinate system after processing is completed, and I think this would be a useful feature.
This could be implemented in each tool, for each input and output node. But this would mean that we would need transformation selection options (e.g., choose between local, world, or a custom transform node) everywhere. This would complicate the implementation and GUI of input and output node selection in all tools.
A similar approach was chosen for CLI modules: all applied transforms are ignored and you need to specify transforms that will be applied to input/output nodes. It did not work out well. Users don’t expect applied transforms to be ignored, and it is not obvious which transform selectors apply to which nodes. You could make things a bit more intuitive with a better GUI. For example, transform selectors could be placed next to the node selectors they apply to. But it would not always be optimal, because sometimes you want to apply the same transform to multiple nodes.
The solution I recommend instead is much simpler to implement. It does not increase complexity in any of the tools, because it is a separate transform tool. The only disadvantage compared to what you propose (i.e., a built-in transformation feature in every tool) is that end users need to add one more tool if they want a transformed input/output node. However, this could be addressed by improving the GUI to make it easier to add and configure tools.
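For illustration, a minimal sketch of how a Dynamic Modeler tool can be set up and chained from Python, using the existing Plane cut tool; a separate transform tool would be wired in the same way as one more tool node. The node names here are hypothetical, and the tool name and node reference role strings should be checked against the Dynamic Modeler documentation for the tool you actually use:

```python
import slicer

# Hypothetical node names: "InputModel", "CutPlane", "CutResult".
inputModelNode = slicer.mrmlScene.GetFirstNodeByName("InputModel")
planeNode = slicer.mrmlScene.GetFirstNodeByName("CutPlane")
outputModelNode = slicer.mrmlScene.AddNewNodeByClass("vtkMRMLModelNode", "CutResult")

# One Dynamic Modeler node per tool in the pipeline.
dynamicModelerNode = slicer.mrmlScene.AddNewNodeByClass("vtkMRMLDynamicModelerNode")
dynamicModelerNode.SetToolName("Plane cut")
dynamicModelerNode.SetNodeReferenceID("PlaneCut.InputModel", inputModelNode.GetID())
dynamicModelerNode.SetNodeReferenceID("PlaneCut.InputPlane", planeNode.GetID())
dynamicModelerNode.SetNodeReferenceID("PlaneCut.OutputPositiveModel", outputModelNode.GetID())

# Run the tool; a separate (hypothetical) transform tool node could be added
# the same way, taking "CutResult" as its input to re-orient the result.
slicer.modules.dynamicmodeler.logic().RunDynamicModelerTool(dynamicModelerNode)
```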
This GUI improvement is in our mid/long-term plans: we plan to have a Model Editor, which will use the Dynamic Modeler as its processing engine for editing models, similarly to how the Segment Editor edits segmentations. We could make model editing immediate (as in the Segment Editor, MeshMixer, and Blender sculpting tools) or parametric (as in parametric CAD modeling tools and Blender modifiers).
There are no feedback loops of any kind; everything simply happens in the world coordinate system. All we do is take applied transforms into account the same way as if they were hardened.
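To make this concrete, here is a minimal sketch (not the actual implementation) of how "taking applied transforms into account as if they were hardened" can be expressed in Slicer's Python console: the model's mesh is mapped into world coordinates through its parent transform before any processing. The node name "InputModel" is hypothetical:

```python
import vtk
import slicer

modelNode = slicer.mrmlScene.GetFirstNodeByName("InputModel")  # hypothetical node

# Collect the full transform-to-world of the node's parent transform (identity
# if the node has no applied transform).
transformToWorld = vtk.vtkGeneralTransform()
parentTransformNode = modelNode.GetParentTransformNode()
if parentTransformNode:
    parentTransformNode.GetTransformToWorld(transformToWorld)

# Apply it to the mesh, giving the same geometry as if the transform were hardened.
transformFilter = vtk.vtkTransformPolyDataFilter()
transformFilter.SetTransform(transformToWorld)
transformFilter.SetInputData(modelNode.GetPolyData())
transformFilter.Update()
worldPolyData = transformFilter.GetOutput()  # mesh in world (physical) coordinates
```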
I would think orders of magnitude more people use Slicer for surgical planning than Blender, simply because Blender is so extremely complicated. But we don’t have data, so there is no way to tell. I agree that those very few users (maybe a few groups in the whole world) may switch to Slicer if we have better tools.
However, the majority of people use much simpler, single-purpose commercial tools; the small percentage of clinicians who use research software prefer simple tools (such as MeshMixer), and only a tiny fraction of advanced clinical research users may choose Mimics or CAD tools. Slicer could compete with all of these tools, but we could probably make the biggest impact by offering single-purpose tools (such as your BoneReconstructionPlanner extension, as an alternative to single-purpose commercial software) and simple editing tools (to compete with MeshMixer and some of the simpler Materialise tools).
Please remind me what it is and where I can find it. The name reminds me of the TransformProcessor module in SlicerIGT. Does it have a similar purpose?