
Robust Human Locomotion and Localization Activity Recognition over Multisensory

Overview
Journal Front Physiol
Date 2024 Mar 7
PMID 38449788
Abstract

Human activity recognition (HAR) plays a pivotal role in various domains, including healthcare, sports, robotics, and security. With the growing popularity of wearable devices, particularly inertial measurement units (IMUs) and ambient sensors, researchers and engineers have sought to take advantage of these advances to accurately and efficiently detect and classify human activities. This research paper presents an advanced methodology for human activity and localization recognition, utilizing smartphone IMU, ambient, GPS, and audio sensor data from two public benchmark datasets: the Opportunity dataset and the Extrasensory dataset. The Opportunity dataset was collected from 12 subjects performing a range of daily activities, and it captures data from various body-worn and object-associated sensors. The Extrasensory dataset features data from 60 participants, including thousands of data samples from smartphone and smartwatch sensors, labeled with a wide array of human activities. Our study incorporates novel feature extraction techniques for signal, GPS, and audio sensor data. Specifically, GPS, audio, and IMU sensors are utilized for localization, while IMU and ambient sensors are employed for locomotion activity recognition. To achieve accurate activity classification, state-of-the-art deep learning techniques, such as convolutional neural networks (CNNs) and long short-term memory (LSTM) networks, have been explored: CNNs are applied for indoor/outdoor activities, while LSTMs are utilized for locomotion activity recognition. The proposed system has been evaluated using the k-fold cross-validation method, achieving accuracy rates of 97% and 89% for locomotion activity over the Opportunity and Extrasensory datasets, respectively, and 96% for indoor/outdoor activity over the Extrasensory dataset. These results highlight the efficiency of our methodology in accurately detecting various human activities, showing its potential for real-world applications. Moreover, the paper introduces a hybrid system that combines machine learning and deep learning features, enhancing activity recognition performance by leveraging the strengths of both approaches.
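The k-fold cross-validation protocol mentioned in the abstract (train on k-1 folds, test on the held-out fold, average the per-fold accuracy) can be sketched as follows. This is a minimal illustration only: the toy features, the nearest-centroid stand-in classifier, and all names here are assumptions, not the authors' CNN/LSTM pipeline or their datasets.

```python
import numpy as np

def kfold_indices(n_samples, k, seed=0):
    """Shuffle sample indices and split them into k roughly equal folds."""
    rng = np.random.default_rng(seed)
    return np.array_split(rng.permutation(n_samples), k)

def cross_validate(X, y, k=5):
    """Mean accuracy over k folds, using a nearest-centroid classifier
    as a lightweight stand-in for the paper's CNN/LSTM models."""
    folds = kfold_indices(len(X), k)
    accs = []
    for i in range(k):
        test_idx = folds[i]
        train_idx = np.concatenate([folds[j] for j in range(k) if j != i])
        classes = np.unique(y[train_idx])
        # One centroid per activity class, computed on the training folds only.
        centroids = np.stack([X[train_idx][y[train_idx] == c].mean(axis=0)
                              for c in classes])
        # Assign each test sample to its nearest class centroid.
        dists = np.linalg.norm(X[test_idx][:, None, :] - centroids[None], axis=2)
        preds = classes[np.argmin(dists, axis=1)]
        accs.append(float(np.mean(preds == y[test_idx])))
    return float(np.mean(accs))

# Toy IMU-like feature vectors: two well-separated activity classes.
rng = np.random.default_rng(42)
X = np.concatenate([rng.normal(0, 1, (50, 6)), rng.normal(5, 1, (50, 6))])
y = np.array([0] * 50 + [1] * 50)
acc = cross_validate(X, y, k=5)
```

Because every sample serves as test data exactly once, the averaged fold accuracy is a less optimistic estimate than a single train/test split, which is presumably why the authors report their 97%/89%/96% figures under this scheme.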

Citing Articles

Enhanced human activity recognition in medical emergencies using a hybrid deep CNN and bi-directional LSTM model with wearable sensors.

Chandramouli N, Natarajan S, Alharbi A, Kannan S, Khafaga D, Raju S. Sci Rep. 2024; 14(1):30979.

PMID: 39730745; PMC: 11680769; DOI: 10.1038/s41598-024-82045-y.


Exponential stability analysis of delayed partial differential equation systems: Applying the Lyapunov method and delay-dependent techniques.

Tian H, Basem A, Kenjrawy H, Al-Rubaye A, Alfalahi S, Azarinfar H. Heliyon. 2024; 10(12):e32650.

PMID: 39668990; PMC: 11637217; DOI: 10.1016/j.heliyon.2024.e32650.


Target detection and classification via EfficientDet and CNN over unmanned aerial vehicles.

Yusuf M, Hanzla M, Al Mudawi N, Sadiq T, Alabdullah B, Rahman H. Front Neurorobot. 2024; 18:1448538.

PMID: 39280254; PMC: 11392906; DOI: 10.3389/fnbot.2024.1448538.


A robust multimodal detection system: physical exercise monitoring in long-term care environments.

Al Mudawi N, Batool M, Alazeb A, Alqahtani Y, Almujally N, Algarni A. Front Bioeng Biotechnol. 2024; 12:1398291.

PMID: 39175622; PMC: 11338868; DOI: 10.3389/fbioe.2024.1398291.


Innovative healthcare solutions: robust hand gesture recognition of daily life routines using 1D CNN.

Al Mudawi N, Ansar H, Alazeb A, Aljuaid H, Alqahtani Y, Algarni A. Front Bioeng Biotechnol. 2024; 12:1401803.

PMID: 39144478; PMC: 11322365; DOI: 10.3389/fbioe.2024.1401803.

