TY - GEN
T1 - Conditional Diffusion-Based Virtual Staining
T2 - 27th International Conference on Pattern Recognition Workshops, ICPRW 2024
AU - Er, Xuanhe
AU - Khattab, Mahmoud
AU - Liao, Iman Yi
AU - Ahmed, Amr
AU - Pan, Jia Wern
AU - Makmur, Haslina
N1 - Publisher Copyright:
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2025.
PY - 2024/12/1
Y1 - 2024/12/1
N2 - The development of advanced image-generative modalities has significantly improved digitalized histopathological diagnostics. Despite its limitations, hematoxylin and eosin (H&E) staining remains the gold standard for cancer diagnosis. However, the contrast in H&E-stained tissue specimens can be difficult to distinguish, necessitating more specific staining approaches. Immunohistochemistry (IHC) addresses this issue by employing antibodies that bind specifically to antigens in biological tissues, but it is time-consuming, expensive, and labor-intensive. A novel deep-learning-based approach is proposed that uses a conditional diffusion model to generate virtually IHC-stained images from H&E images. State-of-the-art methods formulate this image-to-image translation task as a generative adversarial network (GAN) problem; our proposed method instead demonstrates improved performance owing to its stable training process. Results on a benchmark dataset show that the proposed method overcomes the limitations of state-of-the-art staining methods such as CycleGAN and pix2pix, achieving improved PSNR, SSIM, and FID scores and visual quality closer to the ground-truth IHC images.
AB - The development of advanced image-generative modalities has significantly improved digitalized histopathological diagnostics. Despite its limitations, hematoxylin and eosin (H&E) staining remains the gold standard for cancer diagnosis. However, the contrast in H&E-stained tissue specimens can be difficult to distinguish, necessitating more specific staining approaches. Immunohistochemistry (IHC) addresses this issue by employing antibodies that bind specifically to antigens in biological tissues, but it is time-consuming, expensive, and labor-intensive. A novel deep-learning-based approach is proposed that uses a conditional diffusion model to generate virtually IHC-stained images from H&E images. State-of-the-art methods formulate this image-to-image translation task as a generative adversarial network (GAN) problem; our proposed method instead demonstrates improved performance owing to its stable training process. Results on a benchmark dataset show that the proposed method overcomes the limitations of state-of-the-art staining methods such as CycleGAN and pix2pix, achieving improved PSNR, SSIM, and FID scores and visual quality closer to the ground-truth IHC images.
KW - Conditional Diffusion
KW - IHC Virtual Staining
KW - Image Translation
UR - http://www.scopus.com/inward/record.url?scp=105005655721&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=105005655721&partnerID=8YFLogxK
U2 - 10.1007/978-3-031-88220-3_14
DO - 10.1007/978-3-031-88220-3_14
M3 - Conference proceeding (ISBN)
AN - SCOPUS:105005655721
SN - 9783031882197
T3 - Lecture Notes in Computer Science
SP - 193
EP - 207
BT - Pattern Recognition. ICPR 2024 International Workshops and Challenges, 2024, Proceedings
A2 - Palaiahnakote, Shivakumara
A2 - Schuckers, Stephanie
A2 - Ogier, Jean-Marc
A2 - Bhattacharya, Prabir
A2 - Pal, Umapada
A2 - Bhattacharya, Saumik
PB - Springer Science and Business Media Deutschland GmbH
Y2 - 1 December 2024 through 1 December 2024
ER -