
Biosensor-Based Multimodal Deep Human Locomotion Decoding Via Internet of Healthcare Things

Overview
Publisher MDPI
Date 2023 Dec 23
PMID 38138373
Abstract

Multiple Internet of Healthcare Things (IoHT)-based devices have been utilized as sensing methodologies for human locomotion decoding to aid in e-healthcare applications. Daily routine monitoring is affected by several measurement conditions, including the sensor type, wearing style, data retrieval method, and processing model. Several models in this domain combine techniques for pre-processing, descriptor extraction and reduction, and classification of data captured from multiple sensors. However, models built on multi-subject data with such heterogeneous techniques can degrade locomotion-decoding accuracy. Therefore, this study proposes a deep neural network model that not only applies state-of-the-art Quaternion-based filtration to motion and ambient data, along with background subtraction and skeleton modeling for video-based data, but also learns discriminative descriptors from novel graph-based representations and Gaussian Markov random-field mechanisms. Because the data are non-linear, these descriptors are further used to extract a codebook via a Gaussian mixture regression model. The codebook is then fed to a recurrent neural network that classifies the activities for the locomotion-decoding system. We validate the proposed model on two publicly available datasets, HWU-USP and LARa. The proposed model significantly improves on previous systems, achieving 82.22% and 82.50% accuracy on the HWU-USP and LARa datasets, respectively. The proposed IoHT-based locomotion-decoding model is useful for unobtrusive human activity recognition over extended periods in e-healthcare facilities.
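Since the paper's implementation is not included on this page, the following is a minimal Python sketch of the filtration-to-codebook stage outlined in the abstract. All function names, window shapes, and the use of scikit-learn's GaussianMixture (a plain Gaussian mixture model standing in for the paper's Gaussian mixture regression) are assumptions for illustration only, not the authors' code.

```python
# Hypothetical sketch of the multimodal locomotion-decoding pipeline described
# in the abstract: Quaternion-based filtration -> descriptors -> codebook.
import numpy as np
from sklearn.mixture import GaussianMixture

def quaternion_filter(imu_window: np.ndarray) -> np.ndarray:
    """Placeholder for Quaternion-based filtration of motion/ambient data.
    Here it simply normalises each 4-D quaternion sample to unit length."""
    norms = np.linalg.norm(imu_window, axis=-1, keepdims=True)
    return imu_window / np.clip(norms, 1e-8, None)

def extract_descriptors(filtered: np.ndarray) -> np.ndarray:
    """Placeholder for graph-based / Gaussian Markov random-field descriptors.
    A simple per-window statistic stands in for the learned descriptors."""
    return np.concatenate([filtered.mean(axis=0), filtered.std(axis=0)])

def build_codebook(descriptors: np.ndarray, n_words: int = 8) -> GaussianMixture:
    """Codebook construction; the paper uses Gaussian mixture regression,
    a plain GMM is used here only to illustrate the idea."""
    gmm = GaussianMixture(n_components=n_words, covariance_type="diag",
                          random_state=0)
    gmm.fit(descriptors)
    return gmm

# Usage on synthetic data (shapes are assumptions, not the datasets' real ones).
rng = np.random.default_rng(0)
windows = rng.normal(size=(200, 50, 4))      # 200 windows of 50 quaternion samples
feats = np.stack([extract_descriptors(quaternion_filter(w)) for w in windows])
codebook = build_codebook(feats)
word_ids = codebook.predict(feats)           # codewords that would feed the RNN classifier
```

In the paper's pipeline, the resulting codewords are passed to a recurrent neural network for activity classification; that stage is omitted here for brevity.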

Citing Articles

Jing L, Matsumoto Y, Zhang Z. Editorial for the Special Issue on Exploring IoT Sensors and Their Applications: Advancements, Challenges, and Opportunities in Smart Environments. Micromachines (Basel). 2024; 15(8). PMID: 39203699; PMC: 11356150. DOI: 10.3390/mi15081048.
