If you computed the coordinates correctly, then the difference is probably due to interpolation. If you want exactly matching values, use nearest-neighbor sampling instead of interpolation.
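To illustrate why interpolated values don't exactly match stored voxel values, here is a minimal self-contained sketch in plain NumPy (not Slicer code; the toy image and sample point are made up for illustration). At an off-grid position, nearest-neighbor returns an exact stored value, while bilinear interpolation returns a blend of the four surrounding voxels:

```python
import numpy as np

# Toy 2x2 image; values chosen only for illustration
image = np.array([[0.0, 10.0],
                  [20.0, 30.0]])

def sample_nearest(img, r, c):
    """Nearest-neighbor: snap to the closest voxel, so the result is an exact stored value."""
    return img[int(round(r)), int(round(c))]

def sample_bilinear(img, r, c):
    """Bilinear interpolation: weighted average of the 4 surrounding voxels."""
    r0, c0 = int(np.floor(r)), int(np.floor(c))
    dr, dc = r - r0, c - c0
    return (img[r0, c0]         * (1 - dr) * (1 - dc) +
            img[r0, c0 + 1]     * (1 - dr) * dc +
            img[r0 + 1, c0]     * dr       * (1 - dc) +
            img[r0 + 1, c0 + 1] * dr       * dc)

# Sampling at an off-grid point (0.4, 0.4): the two strategies disagree
print(sample_nearest(image, 0.4, 0.4))   # 0.0  (an exact voxel value)
print(sample_bilinear(image, 0.4, 0.4))  # 12.0 (a blended value)
```

This is the kind of discrepancy you would see when comparing values you computed yourself against values read from an interpolated view.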
However, if you want to extract orthogonal slices along a curve, you don't need to implement any new computations. Instead, you can use the Curved Planar Reformat module in the Sandbox extension, which conveniently provides all the reformatted slices in a single 3D array.
However, we have observed another strange behaviour: the images saved as .npy files with name "index" correspond to flyTo(index+1) in the Endoscopy module, and we don't know why the indices do not correspond (see the code in the previous post).
Rendering pipeline updates are limited by the screen refresh rate, while the underlying data may be modified thousands of times per second. If you capture data from the rendering pipeline, make sure you call slicer.util.forceRenderAllViews() first. The rendering pipeline is not intended for acquiring data for quantitative analysis, as it is optimized for visualization. Currently the limitations are modest: a lower refresh rate, image resolution and field of view limited by the display resolution at later stages of the pipeline, and data potentially affected by the user's visualization choices (such as interpolation vs. nearest neighbor). In the future, however, we may implement features such as dynamically lowering the rendering resolution when visualizing very large data sets.
Reslicing/resampling the original image is a more robust way of extracting pixel values from reformatted slices. The non-linear-transform-based method that the Curved Planar Reformat module uses is particularly powerful because it allows transforming any nodes between the world coordinate system and the straightened coordinate system.
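For intuition, here is a simplified 2D sketch of the straightening idea: sample the original image along lines orthogonal to a curve, so each row of the output corresponds to one curve point. This is plain NumPy with hypothetical helper names, not the actual Curved Planar Reformat implementation (which works in 3D with a non-linear transform):

```python
import numpy as np

def bilinear(img, r, c):
    """Bilinear sample with index clamping at the image border."""
    r0 = min(max(int(np.floor(r)), 0), img.shape[0] - 2)
    c0 = min(max(int(np.floor(c)), 0), img.shape[1] - 2)
    dr, dc = r - r0, c - c0
    return (img[r0, c0]         * (1 - dr) * (1 - dc) +
            img[r0, c0 + 1]     * (1 - dr) * dc +
            img[r0 + 1, c0]     * dr       * (1 - dc) +
            img[r0 + 1, c0 + 1] * dr       * dc)

def straighten_2d(img, curve, half_width, spacing=1.0):
    """Resample img along lines orthogonal to the curve.

    Rows of the output correspond to curve points, columns to
    signed offsets along the local normal direction.
    """
    curve = np.asarray(curve, dtype=float)
    # Tangent via central differences; normal is the 90-degree rotation
    tangents = np.gradient(curve, axis=0)
    tangents /= np.linalg.norm(tangents, axis=1, keepdims=True)
    normals = np.stack([-tangents[:, 1], tangents[:, 0]], axis=1)
    offsets = np.arange(-half_width, half_width + spacing, spacing)
    out = np.empty((len(curve), len(offsets)))
    for i, (p, n) in enumerate(zip(curve, normals)):
        for j, d in enumerate(offsets):
            r, c = p + d * n
            out[i, j] = bilinear(img, r, c)
    return out
```

Usage sketch: for an image whose value equals the column index and a horizontal curve, each output row simply reads off the column values around the curve point, which makes the geometry easy to check. The real module generalizes this to 3D and, because the straightening is expressed as a transform node, lets you map other nodes (models, markups) between the two coordinate systems as well.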