PMID: 39867569

Mitigating Over-Saturated Fluorescence Images Through a Semi-Supervised Generative Adversarial Network

Abstract

Multiplex immunofluorescence (MxIF) imaging is a critical tool in biomedical research, offering detailed insights into cell composition and spatial context. For example, DAPI staining identifies cell nuclei, while CD20 staining helps segment cell membranes in MxIF. However, a persistent challenge in MxIF is saturation artifacts, which hinder single-cell analysis in regions with over-saturated pixels. Traditional gamma correction methods for fixing saturation are limited: they typically assume a uniform distribution of saturation, which rarely holds in practice. This paper introduces a novel, data-driven approach to correcting saturation artifacts. We propose a two-stage, high-resolution hybrid generative adversarial network (HDmixGAN) that merges unpaired (CycleGAN) and paired (pix2pixHD) network architectures, designed to capitalize on both the available small-scale paired data and the more extensive unpaired data from costly MxIF experiments. Specifically, we generate pseudo-paired data from a large-scale unpaired over-saturated dataset with a CycleGAN, then train a pix2pixHD network on both the small-scale real pairs and the large-scale synthetic pairs derived from multiple DAPI staining rounds in MxIF. The method was validated against various baselines on a downstream nuclei detection task, improving the F1 score by 6% over the baseline. To our knowledge, this is the first focused effort to address multi-round saturation in MxIF images, offering a specialized solution for enhancing cell analysis accuracy through improved image quality. The source code and implementation of the proposed method are available at https://github.com/MASILab/DAPIArtifactRemoval.git.
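The limitation of global gamma correction noted above can be sketched in a few lines. This is a minimal illustration with a hypothetical toy image, not the paper's baseline implementation: a single gamma exponent darkens mildly over-exposed regions but cannot recover any structure from pixels already clipped at the sensor maximum.

```python
import numpy as np

def gamma_correct(img, gamma):
    """Apply a single global gamma exponent to an image normalized to [0, 1]."""
    return np.clip(img, 0.0, 1.0) ** gamma

# Toy image: left half mildly over-exposed (0.9), right half fully clipped (1.0).
img = np.concatenate([np.full((4, 4), 0.9), np.full((4, 4), 1.0)], axis=1)

corrected = gamma_correct(img, gamma=2.0)
# The mildly over-exposed half is darkened (0.9 ** 2 = 0.81), but every
# clipped pixel remains exactly 1.0: no detail is recovered, which is why
# saturation that is not uniformly distributed defeats a global correction.
```

Because clipped pixels carry no residual signal, any pixel-wise monotone transform leaves them indistinguishable, motivating the learned, data-driven restoration described in the abstract.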
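The two-stage training strategy can be sketched as a data flow. Here `cyclegan_restore` is a hypothetical placeholder for the stage-1 unpaired generator (the real method trains a CycleGAN), and the assembled set would feed a pix2pixHD-style paired translator in stage 2; array contents are synthetic stand-ins, not MxIF data.

```python
import numpy as np

rng = np.random.default_rng(0)

def cyclegan_restore(oversaturated):
    """Stage 1 (placeholder): stands in for a trained CycleGAN generator that
    maps an over-saturated image to a pseudo clean image, creating a pseudo pair."""
    return np.clip(oversaturated - 0.2, 0.0, 1.0)

def build_training_set(real_pairs, unpaired_saturated):
    """Combine small-scale real (saturated, clean) pairs with large-scale
    pseudo pairs synthesized from the unpaired over-saturated pool."""
    pseudo_pairs = [(sat, cyclegan_restore(sat)) for sat in unpaired_saturated]
    return real_pairs + pseudo_pairs

# A handful of real pairs versus a larger unpaired over-saturated pool.
real_pairs = [(rng.random((8, 8)), rng.random((8, 8))) for _ in range(2)]
unpaired = [np.clip(rng.random((8, 8)) + 0.5, 0.0, 1.0) for _ in range(10)]

train_set = build_training_set(real_pairs, unpaired)
# Stage 2 would train a pix2pixHD-style paired network on all 12 pairs.
```

The design point is that the expensive supervised signal (real pairs) is scarce, so the unpaired pool is converted into weak supervision before paired training, rather than being discarded.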
