
CNN-Based Cross-Modality Fusion for Enhanced Breast Cancer Detection Using Mammography and Ultrasound

Overview
Journal Tomography
Publisher MDPI
Specialty Radiology
Date 2024 Dec 27
PMID 39728907
Abstract

Breast cancer is a leading cause of mortality among women in Taiwan and globally. Non-invasive imaging methods, such as mammography and ultrasound, are critical for early detection, yet standalone modalities have limited diagnostic accuracy. This study aims to enhance breast cancer detection through a cross-modality fusion approach combining mammography and ultrasound imaging, using advanced convolutional neural network (CNN) architectures. Breast images were sourced from public datasets, including RSNA, PAS, and Kaggle, and categorized into malignant and benign groups. Data augmentation techniques were used to address class imbalance in the ultrasound dataset. Three models were developed: (1) pre-trained CNNs integrated with machine learning classifiers, (2) transfer learning-based CNNs, and (3) a custom-designed 17-layer CNN for direct classification. Model performance was evaluated using metrics such as accuracy and the Kappa score. The custom 17-layer CNN outperformed the other models, achieving an accuracy of 0.964 and a Kappa score of 0.927. The transfer learning model achieved moderate performance (accuracy 0.846, Kappa 0.694), while the pre-trained CNNs with machine learning classifiers yielded the lowest results (accuracy 0.780, Kappa 0.559). Cross-modality fusion proved effective in leveraging the complementary strengths of mammography and ultrasound imaging. This study demonstrates the potential of cross-modality imaging and tailored CNN architectures to significantly improve diagnostic accuracy and reliability in breast cancer detection. The custom-designed model offers a practical solution for early detection, potentially reducing false positives and false negatives, and improving patient outcomes through timely and accurate diagnosis.
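The abstract reports both accuracy and Cohen's Kappa, which corrects raw agreement for agreement expected by chance. As a reference for how these two metrics relate, here is a minimal sketch in plain Python; the function names and toy benign (0) / malignant (1) labels are illustrative and not taken from the paper:

```python
def accuracy(y_true, y_pred):
    """Fraction of predictions matching the ground-truth labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def cohens_kappa(y_true, y_pred):
    """Cohen's kappa: (p_o - p_e) / (1 - p_e), where p_o is observed
    agreement and p_e is chance agreement from the label marginals."""
    n = len(y_true)
    labels = set(y_true) | set(y_pred)
    p_o = accuracy(y_true, y_pred)
    # chance agreement: product of each class's marginal frequencies
    p_e = sum((y_true.count(c) / n) * (y_pred.count(c) / n) for c in labels)
    return (p_o - p_e) / (1 - p_e)

# toy example: 8 cases, two disagreements
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 0, 1, 1]
print(accuracy(y_true, y_pred))      # 0.75
print(cohens_kappa(y_true, y_pred))  # 0.5
```

Note how kappa (0.5) is well below raw accuracy (0.75) here because chance agreement on a balanced binary problem is already 0.5; this is why the paper's Kappa scores run lower than the corresponding accuracies.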
