PMID: 38446042

Development and Validation of a Deep Learning Model to Reduce the Interference of Rectal Artifacts in MRI-based Prostate Cancer Diagnosis

Abstract

Purpose: To develop an MRI-based model for clinically significant prostate cancer (csPCa) diagnosis that can resist rectal artifact interference.

Materials and Methods: This retrospective study included 2203 male patients with prostate lesions who underwent biparametric MRI and biopsy between January 2019 and June 2023. A targeted adversarial training with proprietary adversarial samples (TPAS) strategy was proposed to enhance model resistance against rectal artifacts. The automated csPCa diagnostic models trained with and without TPAS were compared using multicenter validation datasets. The impact of rectal artifacts on the diagnostic performance of each model at the patient and lesion levels was assessed using the area under the receiver operating characteristic curve (AUC) and the area under the precision-recall curve (AUPRC). AUCs were compared between models with the DeLong test, and AUPRCs with the bootstrap method.

Results: The TPAS model exhibited diagnostic performance improvements of 6% at the patient level (AUC: 0.87 vs 0.81, P < .001) and 7% at the lesion level (AUPRC: 0.84 vs 0.77, P = .007) compared with the control model. The TPAS model demonstrated less performance decline in the presence of rectal artifact-pattern adversarial noise than the control model (ΔAUC: -17% vs -19%; ΔAUPRC: -18% vs -21%). The TPAS model performed better than the control model in patients with moderate (AUC: 0.79 vs 0.73; AUPRC: 0.68 vs 0.61) and severe (AUC: 0.75 vs 0.57; AUPRC: 0.69 vs 0.59) artifacts.

Conclusion: This study demonstrates that the TPAS model can reduce rectal artifact interference in MRI-based csPCa diagnosis, thereby improving its performance in clinical applications.

Keywords: MR Diffusion-weighted Imaging, Urinary, Prostate, Comparative Studies, Diagnosis, Transfer Learning

Clinical trial registration no. ChiCTR23000069832

Published under a CC BY 4.0 license.
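The abstract's statistical comparison (AUC at the patient level, AUPRC at the lesion level, with the bootstrap method for AUPRC differences) can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' implementation: the score distributions, bootstrap count, and two-model setup are all assumptions for demonstration.

```python
# Hedged sketch: bootstrap comparison of AUPRC between two models,
# loosely following the evaluation described in the abstract.
# All data here are synthetic; the paper's actual pipeline is not public in this page.
import numpy as np
from sklearn.metrics import roc_auc_score, average_precision_score

rng = np.random.default_rng(0)

# Synthetic lesion-level labels: 1 = csPCa, 0 = benign (illustrative only).
y = rng.integers(0, 2, size=500)
# Model 1 ("TPAS-like") separates classes better than model 2 ("control-like").
scores_m1 = np.clip(y * 0.6 + rng.normal(0.2, 0.25, size=500), 0.0, 1.0)
scores_m2 = np.clip(y * 0.4 + rng.normal(0.3, 0.30, size=500), 0.0, 1.0)

def bootstrap_auprc_diff(y, s1, s2, n_boot=2000, seed=1):
    """Observed AUPRC difference (s1 - s2) and a 95% percentile bootstrap CI."""
    rng = np.random.default_rng(seed)
    observed = average_precision_score(y, s1) - average_precision_score(y, s2)
    diffs = []
    n = len(y)
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)          # resample cases with replacement
        if y[idx].min() == y[idx].max():          # skip one-class resamples
            continue
        diffs.append(average_precision_score(y[idx], s1[idx])
                     - average_precision_score(y[idx], s2[idx]))
    lo, hi = np.percentile(np.asarray(diffs), [2.5, 97.5])
    return observed, (lo, hi)

diff, ci = bootstrap_auprc_diff(y, scores_m1, scores_m2)
print(f"AUC: {roc_auc_score(y, scores_m1):.2f} vs {roc_auc_score(y, scores_m2):.2f}")
print(f"dAUPRC = {diff:.3f}, 95% bootstrap CI [{ci[0]:.3f}, {ci[1]:.3f}]")
```

A CI excluding zero corresponds to a significant AUPRC difference; the paper's patient-level AUC comparison instead used the DeLong test, which is analytic rather than resampling-based.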

Citing Articles

Comparison of the impact of rectal susceptibility artifacts in prostate magnetic resonance imaging on subjective evaluation and deep learning: a two-center retrospective study.

Wang Z, Lu P, Liu S, Fu C, Ye Y, Yu C. BMC Med Imaging. 2025; 25(1):61.

PMID: 40000986; PMC: 11863642; DOI: 10.1186/s12880-025-01602-7.


BEEx Is an Open-Source Tool That Evaluates Batch Effects in Medical Images to Enable Multicenter Studies.

Wu Y, Xu X, Cheng Y, Zhang X, Liu F, Li Z. Cancer Res. 2024; 85(2):218-230.

PMID: 39661030; PMC: 11735318; DOI: 10.1158/0008-5472.CAN-23-3846.
