
HCA-DAN: Hierarchical Class-aware Domain Adaptive Network for Gastric Tumor Segmentation in 3D CT Images

Overview
Journal: Cancer Imaging
Publisher: Springer Nature
Specialties: Oncology, Radiology
Date: 2024 May 22
PMID: 38773670
Abstract

Background: Accurate segmentation of gastric tumors from CT scans provides useful image information for guiding the diagnosis and treatment of gastric cancer. However, automated gastric tumor segmentation from 3D CT images faces several challenges. The large variation in anisotropic spatial resolution limits the ability of 3D convolutional neural networks (CNNs) to learn features from different views. The background texture of gastric tumors is complex, and their size, shape, and intensity distribution are highly variable, which makes it difficult for deep learning methods to capture tumor boundaries. In addition, while multi-center datasets increase sample size and representativeness, they suffer from inter-center heterogeneity.

Methods: In this study, we propose a new cross-center 3D tumor segmentation method named Hierarchical Class-Aware Domain Adaptive Network (HCA-DAN). It comprises a new 3D backbone that efficiently bridges an anisotropic neural network and a Transformer (AsTr) to extract multi-scale context features from CT images with anisotropic resolution, and a hierarchical class-aware domain alignment (HCADA) module that adaptively aligns these multi-scale features across the two domains by integrating a class attention map with class-specific information. We evaluate the proposed method on an in-house CT image dataset collected from four medical centers and validate its segmentation performance in both in-center and cross-center test scenarios.
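
The abstract does not give implementation details, but the core HCADA idea of aligning cross-domain features while emphasizing class-relevant regions can be illustrated with a short sketch. Below is a minimal, hypothetical PyTorch example of class-attention-weighted adversarial alignment at a single feature scale; the module names, shapes, and weighting scheme are assumptions for illustration only, not the authors' released code.

```python
# Illustrative sketch only: class-attention-weighted adversarial domain alignment
# at one feature scale. All names, shapes, and the weighting scheme are assumed.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DomainDiscriminator(nn.Module):
    """Predicts per-voxel domain logits (source = 0, target = 1) from 3D feature maps."""

    def __init__(self, in_channels: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(in_channels, 64, kernel_size=3, padding=1),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv3d(64, 1, kernel_size=1),
        )

    def forward(self, feat: torch.Tensor) -> torch.Tensor:
        # feat: (B, C, D, H, W) -> domain logits (B, 1, D, H, W)
        return self.net(feat)


def class_aware_alignment_loss(disc, src_feat, tgt_feat, src_attn, tgt_attn):
    """Adversarial alignment loss, re-weighted voxel-wise by class attention maps.

    src_attn / tgt_attn: (B, 1, D, H, W) maps in [0, 1] emphasizing tumor-like
    regions, e.g. derived from the segmentation head's class probabilities.
    """
    src_logits = disc(src_feat)
    tgt_logits = disc(tgt_feat)
    src_bce = F.binary_cross_entropy_with_logits(
        src_logits, torch.zeros_like(src_logits), reduction="none")
    tgt_bce = F.binary_cross_entropy_with_logits(
        tgt_logits, torch.ones_like(tgt_logits), reduction="none")
    # Emphasize class-relevant voxels so alignment focuses on tumor regions.
    return (src_attn * src_bce).mean() + (tgt_attn * tgt_bce).mean()
```

In a full training loop, such a loss would typically be combined with a gradient-reversal layer or alternating discriminator/segmenter updates, and applied at multiple feature scales to mirror the hierarchical alignment described above.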

Results: Our baseline segmentation network (i.e., AsTr) achieves the best results among the compared 3D segmentation models, with mean Dice similarity coefficients (DSC) of 59.26%, 55.97%, 48.83%, and 67.28% in the four in-center test tasks, and 56.42%, 55.94%, 46.54%, and 60.62% in the four cross-center test tasks. In addition, the proposed cross-center segmentation network (i.e., HCA-DAN) outperforms other unsupervised domain adaptation methods, with DSCs of 58.36%, 56.72%, 49.25%, and 62.20% in the four cross-center test tasks.
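
For reference, the Dice similarity coefficient reported above measures voxel-wise overlap between a predicted mask P and the ground-truth mask T, DSC = 2|P ∩ T| / (|P| + |T|). A generic computation on binary masks (not the authors' evaluation code) looks like:

```python
# Minimal Dice similarity coefficient (DSC) on binary masks; a generic
# illustration of the metric reported above, not the paper's evaluation code.
import numpy as np


def dice_coefficient(pred: np.ndarray, truth: np.ndarray, eps: float = 1e-7) -> float:
    """DSC = 2|P ∩ T| / (|P| + |T|); returns a value in [0, 1] (reported above as %)."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    return float(2.0 * intersection / (pred.sum() + truth.sum() + eps))
```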

Conclusions: Comprehensive experimental results demonstrate that the proposed method outperforms the compared methods on this multi-center database and is promising for routine clinical workflows.

Citing Articles

Zhang C, Deng X, Ling S. Next-Gen Medical Imaging: U-Net Evolution and the Rise of Transformers. Sensors (Basel). 2024; 24(14). PMID: 39066065. PMC: 11280776. DOI: 10.3390/s24144668.
