There seems to be an issue with the way the PETDICOM module operates on PET images taken from a Philips machine.
The computed SUVs are far too low. Looking under the hood, it seems the right way to compute SUV values in this case is to multiply each pixel by the value stored in the private DICOM tag 7053|1000. But I guess the PETDICOM extension doesn't take that into account?
Yeah, I can more or less confirm that the PETDICOM module computes a wrong (but consistent) SUV normalization factor when fed Philips images.
The way to do it is to load the unweighted scalar volume and multiply everything by the value found in the private DICOM tag 7053|1000. By doing this I find coherent SUV values for the liver.
Maybe it could be added as an option inside the module ?
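As a rough sketch of the workaround described above (the helper name is hypothetical, and reading the private tag with pydicom is shown only in a comment, assuming `ds` is the already-loaded dataset):

```python
import numpy as np

def apply_philips_suv_factor(volume: np.ndarray, scale_factor: float) -> np.ndarray:
    """Multiply an unweighted scalar PET volume by the SUVbw scale
    factor stored in the Philips private tag (0x7053,0x1000).

    With pydicom, the factor could be read as, for example:
        scale_factor = float(ds[0x7053, 0x1000].value)
    """
    return volume * scale_factor
```

The same element-wise multiplication can of course be done on the voxel array of a volume loaded in Slicer.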
Thanks for the report. It’s very possible that Philips encodes PET differently than the examples used in developing the extension. If you could file an issue (either in the Slicer GitHub, or, if you used one of the PET extensions, in the appropriate repository), and ideally include an anonymized sample study for development and testing, that would be great. Even better: if you are a coder, you could probably easily add a branch to handle the special cases for this kind of acquisition.
As you probably know, vendors often have very specific and non-standard ways of encoding data. In this case, the group (7053) is an odd number, meaning this is a so-called “private tag” that is not part of the DICOM standard and needs to be handled as a custom-coded special case.
It is a known schema; the Units (0x0054,0x1001) should be CNTS, BTW. SUVbwScaleFactor is in fact already calculated and saved in that Philips private tag and, as correctly mentioned above, SUV should be calculated as: SUVbw = (stored pixel value in Pixel Data (0x7FE0,0x0010) * Rescale Slope (0x0028,0x1053) + Rescale Intercept (0x0028,0x1052)) * SUVbwScaleFactor
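In Python, that formula can be sketched as follows (a minimal illustration; the function name and arguments are hypothetical, and the tag values are assumed to have been read out of the dataset beforehand):

```python
def suv_bw_from_stored(stored_value, rescale_slope, rescale_intercept,
                       units, suv_bw_scale_factor=None):
    """Sketch of the SUVbw calculation for Philips CNTS data.

    units: value of Units (0x0054,0x1001).
    suv_bw_scale_factor: value of the private tag (0x7053,0x1000).
    """
    # First recover the real-world value from the stored pixel value.
    real_world = stored_value * rescale_slope + rescale_intercept
    if units == "CNTS":
        if suv_bw_scale_factor is None:
            raise ValueError("CNTS data requires the (7053,1000) scale factor")
        # The private tag already folds in the body-weight/dose/decay terms.
        return real_world * suv_bw_scale_factor
    # BQML data would instead need the body-weight/dose/decay correction
    # described on the QIBA wiki; not sketched here.
    raise NotImplementedError("only the CNTS/Philips path is sketched")
```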
The README of Slicer-PETDICOMExtension explains: The PET DICOM Extension provides tools to import PET Standardized Uptake Value (SUV) images from DICOM into 3D Slicer. SUV computation is based on the vendor-neutral “happy path only” calculation described on the Quantitative Imaging Biomarkers Alliance (QIBA) wiki page Standardized Uptake Value (SUV).
Dear Tristan, I’m a NM doctor from Qazaqstan using 3D Slicer to segment and calculate PET scans. All patients are scanned with Philips machines, and in some patients the SUV is calculated incorrectly. Some patients have additional files like in the image.
Can you please help me to manage it? In a more or less simple way, since I’m not an IT guy))
Also having this issue on DICOMs acquired with a Philips scanner and the PETDICOMExtension, which is problematic because other modules use the calculated SUV values. Is there a reason the “happy path only” SUV calculation was used? Is it possible to choose the alternative vendor-neutral path, which first determines whether Units (0x0054,0x1001) is BQML or CNTS and then performs the appropriate calculation (with or without the scale factor)? Alternatively, would it be possible to introduce an option for the SUV calculation technique within the module? Thanks for the great tool.
For the reasons described above it’s hard to test on the full spectrum of real-world PET data so the module implements best-efforts for the data and information the developers had at the time. It would be great for the community to chip in and improve the code to work robustly on a broader range of data.
Hi Steve
Very willing to take a crack at it, though my coding is pretty shaky. I see the PETDICOMExtension calls SUVFactorCalculator, which I assume runs the happy-path SUV calculation and returns the real-world values? I’m not sure whether the calculator can be edited in Python? Would be grateful if you or @fedorov could advise. Thanks for the assistance!
I do not know who, if anyone, from the original team is still supporting that extension (it’s been a long while since the grant funding that development finished), but let’s try pinging Christian @chribaue.