Unpaired Stain Transfer Using Pathology-Consistent Constrained Generative Adversarial Networks

Pathological examination is the gold standard for cancer diagnosis. Common pathological examinations include hematoxylin-eosin (H&E) staining and immunohistochemistry (IHC). In some cases, it is difficult to make an accurate cancer diagnosis from H&E-stained images alone, whereas IHC examination can provide further evidence for the diagnostic process. Hence, generating virtual IHC images from H&E-stained images offers a promising solution to the limited accessibility of IHC examination, especially in low-resource regions. However, existing approaches have limitations in preserving microscopic structure and maintaining consistency of pathological properties. In addition, pixel-level paired data are hard to obtain. In this work, we propose a novel adversarial learning method for effective generation of Ki-67-stained images from corresponding H&E-stained images. Our method takes full advantage of a structural similarity constraint and skip connections to improve the preservation of structural details; moreover, a pathology consistency constraint and a pathological representation network are proposed for the first time to enforce that the generated and source images hold the same pathological properties in their respective staining domains. We empirically demonstrate the effectiveness of our approach on two unpaired histopathological datasets. Extensive experiments indicate the superior performance of our method, which surpasses state-of-the-art approaches by a significant margin. In addition, our approach achieves stable, good performance on unbalanced datasets, demonstrating strong robustness. We believe our method has significant potential for clinical virtual staining and can advance the progress of computer-aided multi-staining histology image analysis.
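The abstract names three generator objectives at a high level: an adversarial term, a structural similarity constraint, and a pathology consistency constraint computed through a pathological representation network. The PyTorch sketch below is one plausible way to combine such terms, not the paper's actual implementation; the `Generator`, `Discriminator`, and `PathologyNet` modules, the loss weights, and the exact form of each term are illustrative assumptions.

```python
# Minimal sketch (assumptions, not the authors' released code): adversarial loss
# plus an SSIM-based structural constraint and a feature-level pathology
# consistency term, mirroring the components described in the abstract.
import torch
import torch.nn.functional as F


def _luminance(x):
    # Compare structure on luminance so color differences between the
    # H&E and Ki-67 staining domains do not dominate the SSIM term.
    return x.mean(dim=1, keepdim=True)


def ssim_loss(x, y, c1=0.01 ** 2, c2=0.03 ** 2, window=11):
    """1 - SSIM with a simple uniform window (illustrative implementation)."""
    x, y = _luminance(x), _luminance(y)
    pad = window // 2
    mu_x = F.avg_pool2d(x, window, stride=1, padding=pad)
    mu_y = F.avg_pool2d(y, window, stride=1, padding=pad)
    sigma_x = F.avg_pool2d(x * x, window, stride=1, padding=pad) - mu_x ** 2
    sigma_y = F.avg_pool2d(y * y, window, stride=1, padding=pad) - mu_y ** 2
    sigma_xy = F.avg_pool2d(x * y, window, stride=1, padding=pad) - mu_x * mu_y
    ssim = ((2 * mu_x * mu_y + c1) * (2 * sigma_xy + c2)) / (
        (mu_x ** 2 + mu_y ** 2 + c1) * (sigma_x + sigma_y + c2)
    )
    return 1.0 - ssim.mean()


def generator_loss(he_image, generator, discriminator, pathology_net,
                   lambda_struct=10.0, lambda_path=5.0):
    """Combined generator objective (hypothetical weighting and loss forms)."""
    fake_ihc = generator(he_image)                      # virtual Ki-67 image
    pred_fake = discriminator(fake_ihc)
    adv = F.mse_loss(pred_fake, torch.ones_like(pred_fake))   # LSGAN-style term
    struct = ssim_loss(fake_ihc, he_image)              # preserve microscopic structure
    path = F.l1_loss(pathology_net(fake_ihc),           # keep pathological properties
                     pathology_net(he_image).detach())  # consistent across stain domains
    return adv + lambda_struct * struct + lambda_path * path
```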