
Deep Convolutional Neural Networks with Multiplane Consensus Labeling for Lung Function Quantification Using UTE Proton MRI

Overview
Date 2019 Apr 5
PMID 30945385
Citations 16
Abstract

Background: Ultrashort echo time (UTE) proton MRI has gained popularity for assessing lung structure and function in pulmonary imaging; however, the development of rapid biomarker extraction and regional quantification has lagged behind due to labor-intensive lung segmentation.

Purpose: To evaluate a deep learning (DL) approach for automated lung segmentation to extract image-based biomarkers from functional lung imaging using 3D radial UTE oxygen-enhanced (OE) MRI.

Study Type: Retrospective study evaluating a technical development.

Population: Forty-five human subjects: 16 healthy volunteers, 5 patients with asthma, and 24 patients with cystic fibrosis.

Field Strength/Sequence: 1.5T MRI, 3D radial UTE (TE = 0.08 msec) sequence.

Assessment: Two 3D radial UTE volumes were acquired sequentially under normoxic (21% O₂) and hyperoxic (100% O₂) conditions. Automated lung segmentation using a 2D convolutional encoder-decoder-based DL method, and the subsequent functional quantification via adaptive K-means, were compared with results obtained with the reference method, supervised region growing.
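The title refers to multiplane consensus labeling; a minimal sketch of that idea is given below, assuming a generic slice-wise 2D encoder-decoder segmenter (`model`) that returns per-pixel logits. The function names, the binary lung/background labeling, and the two-of-three voting rule are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' implementation): apply a 2D segmentation
# network slice-by-slice along each anatomical axis, then take a per-voxel
# majority vote across the three resampled predictions.
import numpy as np
import torch

def segment_plane(model, volume, axis):
    """Run a 2D encoder-decoder on every slice along one axis; return a label volume."""
    vol = np.moveaxis(volume, axis, 0)                     # bring slicing axis to front
    masks = []
    with torch.no_grad():
        for sl in vol:
            x = torch.from_numpy(sl[None, None].astype(np.float32))  # (1, 1, H, W)
            masks.append(model(x).argmax(dim=1).squeeze(0).numpy())  # per-pixel labels
    return np.moveaxis(np.stack(masks), 0, axis)           # restore original orientation

def multiplane_consensus(model, volume):
    """Binary lung mask kept wherever at least two of the three planes agree."""
    votes = sum(segment_plane(model, volume, ax) for ax in (0, 1, 2))
    return (votes >= 2).astype(np.uint8)
```

Axial, coronal, and sagittal predictions tend to fail in different regions (for example near the diaphragm or the lung apices), so a per-voxel majority vote of this kind can suppress plane-specific errors of a purely 2D model.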

Statistical Tests: Relative to the reference method, the performance of DL on volumetric quantification was assessed using the Dice coefficient with 95% confidence interval (CI) for accuracy, a two-sided Wilcoxon signed-rank test for computation time, and Bland-Altman analysis of the functional measure derived from the OE images.
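As a rough illustration of these statistics, the sketch below computes a Dice coefficient, a Bland-Altman bias with 95% limits of agreement, and a two-sided Wilcoxon signed-rank test with NumPy/SciPy; the array names and values are placeholders, not data from the study.

```python
# Illustrative sketch of the reported statistics, assuming binary masks and
# paired per-subject measurements (all inputs below are placeholders).
import numpy as np
from scipy import stats

def dice(a, b):
    """Dice overlap between two binary masks."""
    a, b = np.asarray(a, dtype=bool), np.asarray(b, dtype=bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

def bland_altman(x, y):
    """Mean intermethod difference (bias) and 95% limits of agreement."""
    d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    bias, sd = d.mean(), d.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Paired per-subject computation times in seconds (placeholder values)
t_dl = np.array([44.0, 47.0, 46.5, 45.2])
t_ref = np.array([6900.0, 7100.0, 6800.0, 7000.0])
stat, p = stats.wilcoxon(t_dl, t_ref, alternative="two-sided")
```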

Results: The DL method produced strong agreement with supervised region growing for the right (Dice: 0.97; 95% CI = [0.96, 0.97]; P < 0.001) and left lungs (Dice: 0.96; 95% CI = [0.96, 0.97]; P < 0.001). The DL method averaged 46 seconds to generate the automatic segmentations, in contrast to 1.93 hours with the reference method (P < 0.001). Bland-Altman analysis showed nonsignificant intermethod differences in volumetric (P ≥ 0.12) and functional (P ≥ 0.34) measurements in the left and right lungs.

Data Conclusion: DL provides rapid, automated, and robust lung segmentation for quantification of regional lung function using UTE proton MRI.

Level of Evidence: 2. Technical Efficacy: Stage 1. J Magn Reson Imaging 2019;50:1169-1181.

Citing Articles

Automated lung segmentation on chest MRI in children with cystic fibrosis.

Ringwald F, Wucherpfennig L, Hagen N, Mucke J, Kaletta S, Eichinger M. Front Med (Lausanne). 2024; 11:1401473.

PMID: 39606627 PMC: 11600534. DOI: 10.3389/fmed.2024.1401473.


Automatic lung segmentation of magnetic resonance images: A new approach applied to healthy volunteers undergoing enhanced Deep-Inspiration-Breath-Hold for motion-mitigated 4D proton therapy of lung tumors.

Missimer J, Emert F, Lomax A, Weber D. Phys Imaging Radiat Oncol. 2024; 29:100531.

PMID: 38292650 PMC: 10825631. DOI: 10.1016/j.phro.2024.100531.


Automated MRI Lung Segmentation and 3D Morphologic Features for Quantification of Neonatal Lung Disease.

Mairhormann B, Castelblanco A, Hafner F, Koliogiannis V, Haist L, Winter D. Radiol Artif Intell. 2023; 5(6):e220239.

PMID: 38074782 PMC: 10698600. DOI: 10.1148/ryai.220239.


Combining neural networks and image synthesis to enable automatic thoracic cavity segmentation of hyperpolarized Xe MRI without proton scans.

Leewiwatwong S, Lu J, Dummer I, Yarnall K, Mummy D, Wang Z. Magn Reson Imaging. 2023; 103:145-155.

PMID: 37406744 PMC: 10528669. DOI: 10.1016/j.mri.2023.07.001.


Implementable Deep Learning for Multi-sequence Proton MRI Lung Segmentation: A Multi-center, Multi-vendor, and Multi-disease Study.

Astley J, Biancardi A, Hughes P, Marshall H, Collier G, Chan H. J Magn Reson Imaging. 2023; 58(4):1030-1044.

PMID: 36799341 PMC: 10946727. DOI: 10.1002/jmri.28643.

