I think this is expected for real-life motion, where you cannot ensure that the rotation happens exactly about a single axis. In general, an orientation difference cannot be described by a rotation about a single axis; you need to rotate about two axes (or around a point). If you have several (preferably dozens of) different poses, then you can compute an average instantaneous axis of rotation from them, even if there are slight additional rotations (similarly to how the Pivot calibration module computes the rotation axis in the SlicerIGT extension), but I'm not sure whether this can be applied to just two poses.
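Just to illustrate the averaging idea, here is a rough NumPy/SciPy sketch (not how the Pivot calibration module actually implements it): take consecutive pairs of orientations, extract the axis of each relative rotation from its rotation vector, align the signs, and average. An SVD-based fit would be more robust, but this shows the basic principle.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def estimate_common_rotation_axis(rotations):
    """Estimate a dominant rotation axis from a sequence of orientations
    (scipy Rotation objects) by averaging the axes of consecutive
    relative rotations. Rough sketch, not the SlicerIGT implementation."""
    axes = []
    for r_prev, r_next in zip(rotations[:-1], rotations[1:]):
        # relative rotation taking the previous orientation to the next one
        rotvec = (r_next * r_prev.inv()).as_rotvec()
        angle = np.linalg.norm(rotvec)
        if angle < 1e-6:
            continue  # skip nearly identical poses
        axis = rotvec / angle
        # flip sign so all axis estimates point in a consistent direction
        if axes and np.dot(axis, axes[0]) < 0:
            axis = -axis
        axes.append(axis)
    mean_axis = np.mean(axes, axis=0)
    return mean_axis / np.linalg.norm(mean_axis)

# synthetic example: rotation mostly about Z with small extra rotations
rng = np.random.default_rng(0)
poses = [Rotation.from_euler("zxy",
                             [10 * i, rng.normal(0, 0.5), rng.normal(0, 0.5)],
                             degrees=True)
         for i in range(20)]
print(estimate_common_rotation_axis(poses))  # should be close to [0, 0, 1]
```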
@mholden8 Do you have any recommendation?