
Whole and Part Adaptive Fusion Graph Convolutional Networks for Skeleton-Based Action Recognition

Overview
Journal Sensors (Basel)
Publisher MDPI
Specialty Biotechnology
Date 2020 Dec 16
PMID 33322231
Citations 4
Abstract

Spatiotemporal graph convolution has made significant progress in skeleton-based action recognition in recent years. Most existing graph convolution methods model all joints of the human skeleton as a single graph, ignoring the differences in movement patterns across body parts and failing to capture the relationships between different parts of the skeleton. To capture both the unique features of individual parts of the human skeleton and the correlations between them, we propose two new graph convolution methods: the whole graph convolution network (WGCN) and the part graph convolution network (PGCN). WGCN learns whole-scale spatiotemporal skeleton features according to the movement patterns and physical structure of the human skeleton. PGCN divides the skeleton graph into several subgraphs to learn part-scale spatiotemporal features. Moreover, we propose an adaptive fusion module that combines the two feature types through multiple complementary adaptive fusions to obtain more effective skeleton features. By coupling these proposals, we build a whole and part adaptive fusion graph convolutional network (WPGCN) that outperforms previous state-of-the-art methods on three large-scale datasets: NTU RGB+D 60, NTU RGB+D 120, and Kinetics Skeleton 400.
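The whole/part scheme described above can be sketched in a few lines of NumPy: a standard normalized graph convolution over the full skeleton (whole scale), the same operation restricted to a subgraph of joints (part scale), and a sigmoid-gated combination of the two feature maps. This is a minimal illustration under assumed details, not the authors' implementation: the 5-joint toy skeleton, the choice of upper-body subgraph, and the per-channel sigmoid gate are all illustrative assumptions.

```python
import numpy as np

def normalize_adjacency(A):
    # Symmetric normalization D^{-1/2} (A + I) D^{-1/2}, standard in GCNs.
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def graph_conv(X, A_norm, W):
    # One spatial graph convolution: aggregate neighbor features, then project.
    return A_norm @ X @ W

def adaptive_fusion(F_whole, F_part, alpha):
    # Gated fusion of whole-scale and part-scale features; sigmoid(alpha)
    # plays the role of a learned per-channel mixing weight.
    gate = 1.0 / (1.0 + np.exp(-alpha))
    return gate * F_whole + (1.0 - gate) * F_part

# Toy 5-joint skeleton (assumed layout): 0 spine, 1 neck, 2 hip,
# 3 left shoulder, 4 right shoulder, connected along the physical structure.
A = np.zeros((5, 5))
for i, j in [(0, 1), (0, 2), (1, 3), (1, 4)]:
    A[i, j] = A[j, i] = 1.0

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))   # 5 joints, 3-D coordinates per joint
W = rng.standard_normal((3, 8))   # shared feature projection

# Whole scale: convolve over the full skeleton graph.
A_norm = normalize_adjacency(A)
F_whole = graph_conv(X, A_norm, W)

# Part scale: convolve only within one subgraph (here, upper-body joints).
part = [1, 3, 4]
A_part_norm = normalize_adjacency(A[np.ix_(part, part)])
F_part = F_whole.copy()
F_part[part] = graph_conv(X[part], A_part_norm, W)

# Adaptive fusion of the two feature maps (alpha would be learned in training).
fused = adaptive_fusion(F_whole, F_part, alpha=rng.standard_normal((1, 8)))
print(fused.shape)  # (5, 8)
```

Because the gate lies in (0, 1), the fused features always fall between the whole-scale and part-scale responses channel by channel; training the gate lets the network decide, per channel, how much each scale should contribute.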

Citing Articles

Susceptibility-Weighted MRI for Predicting NF-2 Mutations and S100 Protein Expression in Meningiomas.

Azamat S, Buz-Yalug B, Dindar S, Yilmaz Tan K, Ozcan A, Can O Diagnostics (Basel). 2024; 14(7).

PMID: 38611661 PMC: 11012050. DOI: 10.3390/diagnostics14070748.


Skeleton Graph-Neural-Network-Based Human Action Recognition: A Survey.

Feng M, Meunier J Sensors (Basel). 2022; 22(6).

PMID: 35336262 PMC: 8952863. DOI: 10.3390/s22062091.


A Hybrid Deep Learning Model for Recognizing Actions of Distracted Drivers.

Jiao S, Liu L, Liu Q Sensors (Basel). 2021; 21(21).

PMID: 34770728 PMC: 8588220. DOI: 10.3390/s21217424.


ASNet: Auto-Augmented Siamese Neural Network for Action Recognition.

Zhang Y, Po L, Xiong J, Rehman Y, Cheung K Sensors (Basel). 2021; 21(14).

PMID: 34300460 PMC: 8309510. DOI: 10.3390/s21144720.

References
1.
Liu J, Shahroudy A, Perez M, Wang G, Duan L, Kot A . NTU RGB+D 120: A Large-Scale Benchmark for 3D Human Activity Understanding. IEEE Trans Pattern Anal Mach Intell. 2019; 42(10):2684-2701. DOI: 10.1109/TPAMI.2019.2916873. View

2.
Chan W, Tian Z, Wu Y . GAS-GCN: Gated Action-Specific Graph Convolutional Networks for Skeleton-Based Action Recognition. Sensors (Basel). 2020; 20(12). PMC: 7349730. DOI: 10.3390/s20123499. View

3.
Li F, Li J, Zhu A, Xu Y, Yin H, Hua G . Enhanced Spatial and Extended Temporal Graph Convolutional Network for Skeleton-Based Action Recognition. Sensors (Basel). 2020; 20(18). PMC: 7571203. DOI: 10.3390/s20185260. View

4.
Shi L, Zhang Y, Cheng J, Lu H . Skeleton-Based Action Recognition with Multi-Stream Adaptive Graph Convolutional Networks. IEEE Trans Image Process. 2020; PP. DOI: 10.1109/TIP.2020.3028207. View