A probabilistic conditional adversarial neural network to reduce imaging variation in radiography

Medical images can vary due to differences in imaging equipment and conditions. This variability can negatively impact the consistency and accuracy of diagnostic processes. Hence, it is critical to reduce variability in image acquisition to achieve consistent analysis, both visually and computationally. Three main factors contribute to image variability: equipment, acquisition protocol, and image processing. The purpose of this study was to employ a deep neural network (DNN) to reduce variability in radiography arising from these factors. Given radiography images acquired with different settings, the network was trained to return harmonized images targeting a reference standard. This was implemented via a virtual imaging trial platform, utilizing an X-ray simulator (DukeSim) and 77 anthropomorphic computational phantoms (XCAT). The phantoms were imaged at 120 kV at four dose levels, with DukeSim emulating a typical flat-panel radiography system. The raw radiographs were then post-processed with a commercial algorithm at eight different settings, yielding a total of 2464 radiographs. For each XCAT phantom, the reference standard was defined as the noise-free and scatter-free radiograph with image processing parameters based on a radiologist's preference. The simulated images were then used to train and test the DNN. On the test set, the harmonized images achieved an average structural similarity index (SSIM) greater than 0.84 and an L1 error less than 0.02, indicating that they were visually and analytically more consistent and closer to the desired reference appearance. The proposed method has great potential to enable effective and uniform interpretation of radiographic images.
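The abstract reports harmonization quality via SSIM and L1 error between harmonized and reference images. The study's actual evaluation code is not given; the sketch below is a minimal, illustrative implementation of these two metrics in plain NumPy, using a simplified global SSIM (no sliding window, unlike the windowed SSIM typically used in practice) and hypothetical synthetic image data in place of real radiographs.

```python
import numpy as np

def l1_error(img, ref):
    """Mean absolute pixel difference (assumes images normalized to [0, 1])."""
    return float(np.mean(np.abs(img - ref)))

def global_ssim(img, ref, data_range=1.0):
    """Simplified global SSIM over the whole image -- illustrative only."""
    c1 = (0.01 * data_range) ** 2  # standard SSIM stabilization constants
    c2 = (0.03 * data_range) ** 2
    mu_x, mu_y = img.mean(), ref.mean()
    var_x, var_y = img.var(), ref.var()
    cov = ((img - mu_x) * (ref - mu_y)).mean()
    return float(((2 * mu_x * mu_y + c1) * (2 * cov + c2)) /
                 ((mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2)))

# Hypothetical example: a reference image and a slightly perturbed
# "harmonized" candidate standing in for a DNN output.
rng = np.random.default_rng(0)
ref = rng.random((64, 64))
harmonized = np.clip(ref + rng.normal(0.0, 0.01, ref.shape), 0.0, 1.0)

print(f"L1 error: {l1_error(harmonized, ref):.4f}")
print(f"SSIM:     {global_ssim(harmonized, ref):.4f}")
```

In practice, a windowed SSIM (e.g. `skimage.metrics.structural_similarity`) would be preferred, since it captures local structural agreement rather than only global statistics.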