
Swin-UNet++: A Nested Swin Transformer Architecture for Location Identification and Morphology Segmentation of Dimples on 2.25Cr1Mo0.25V Fractured Surface

Overview
Publisher MDPI
Date 2021 Dec 24
PMID 34947098
Citations 4
Abstract

The precise identification of micro-features on 2.25Cr1Mo0.25V steel is of great significance for understanding the mechanism of hydrogen embrittlement (HE) and for evaluating the alloy's resistance to HE. At present, convolutional neural networks (CNNs) are widely applied to the identification of alloy micro-features. However, with the development of the transformer in image recognition, transformer-based neural networks learn global and long-range semantic information better than CNNs and achieve higher prediction accuracy. In this work, a new transformer-based neural network model, Swin-UNet++, is proposed. Specifically, the architecture of the decoder was redesigned to detect and identify more precisely the micro-features with complex morphology (i.e., dimples) on the 2.25Cr1Mo0.25V steel fracture surface. Swin-UNet++ was compared with other segmentation models that achieve state-of-the-art (SOTA) performance, using a dimple dataset constructed in this work that consists of 830 scanning electron microscopy (SEM) images of dimples on 2.25Cr1Mo0.25V steel fracture surfaces. The segmentation results show that Swin-UNet++ not only identifies dimples accurately but also displays much higher prediction accuracy and stronger robustness than Swin-Unet and UNet. Moreover, this work will also provide an important reference for the identification of other micro-features with complex morphologies.
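The "nested" in Swin-UNet++ follows the spirit of UNet++ (Zhou et al.), in which each redesigned decoder node aggregates the outputs of all earlier nodes at the same depth via dense skip connections, plus one up-sampled output from the next-deeper level. The abstract does not give the exact wiring, so the following is a minimal, hypothetical sketch of that dense-skip indexing scheme; the function name and node notation X^{i,j} (depth i, position j along the skip pathway) are illustrative, not the authors' code:

```python
def nested_skip_inputs(i, j):
    """Return the (depth, position) indices of the nodes feeding
    decoder node X^{i,j} in a UNet++-style nested architecture.

    Hypothetical sketch of the dense-skip wiring; the published
    Swin-UNet++ decoder may differ in details.
    """
    if j == 0:
        # X^{i,0} is an encoder node: fed only by the downsampling
        # path, so it has no skip-connection inputs.
        return []
    # Dense skips: every earlier node at the same depth i ...
    same_depth = [(i, k) for k in range(j)]
    # ... plus the up-sampled output of the next-deeper node.
    upsampled = (i + 1, j - 1)
    return same_depth + [upsampled]
```

For example, the top-level node X^{0,2} would concatenate X^{0,0}, X^{0,1}, and the up-sampled X^{1,1} before its (here, Swin Transformer) block is applied; in the original UNet++ these aggregation blocks are convolutional.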

Citing Articles

Automatic segmentation of esophageal cancer, metastatic lymph nodes and their adjacent structures in CTA images based on the UperNet Swin network.

Wang R, Chen X, Zhang X, He P, Ma J, Cui H. Cancer Med. 2024; 13(18):e70188.

PMID: 39300922 PMC: 11413407. DOI: 10.1002/cam4.70188.


Automatic substantia nigra segmentation with Swin-Unet in susceptibility- and T2-weighted imaging: application to Parkinson disease diagnosis.

Wang T, Wang Y, Zhu H, Liu Z, Chen Y, Wang L. Quant Imaging Med Surg. 2024; 14(9):6337-6351.

PMID: 39281181 PMC: 11400694. DOI: 10.21037/qims-24-27.


Transformer-Based Deep Learning Network for Tooth Segmentation on Panoramic Radiographs.

Sheng C, Wang L, Huang Z, Wang T, Guo Y, Hou W. J Syst Sci Complex. 2022; 36(1):257-272.

PMID: 36258771 PMC: 9561331. DOI: 10.1007/s11424-022-2057-9.


Label Noise Learning Method for Metallographic Image Recognition of Heat-Resistant Steel for Use in Pressure Equipment.

Shen Z, Hu H, Huang Z, Zhang Y, Wang Y, Li X. Materials (Basel). 2022; 15(19).

PMID: 36234378 PMC: 9572554. DOI: 10.3390/ma15197037.
