
Biosensor-Driven IoT Wearables for Accurate Body Motion Tracking and Localization

Overview
Journal Sensors (Basel)
Publisher MDPI
Specialty Biotechnology
Date 2024 May 25
PMID 38793886
Abstract

Human locomotion identification through smartphone sensors is a rapidly expanding research domain, with significant potential across sectors including healthcare, sports, security systems, home automation, and real-time location tracking. Despite the considerable volume of existing research, most of it has concentrated on locomotion activities, with comparatively little emphasis on the recognition of human localization patterns. In the current study, we introduce a system that recognizes both human physical and location-based activity patterns using smartphone sensors. Our goal is to accurately identify different physical and localization activities, such as walking, running, jumping, and indoor and outdoor activities. To achieve this, we preprocess the raw sensor data with a Butterworth filter for the inertial sensors and a median filter for the Global Positioning System (GPS) data, and then apply Hamming windowing to segment the filtered signals. We extract features from the inertial and GPS streams and select relevant features using the variance-threshold feature selection method. Because the Extrasensory dataset contains an imbalanced number of samples for certain activities, a permutation-based data augmentation technique is employed. The augmented features are optimized using the Yeo-Johnson power transformation before being passed to a multi-layer perceptron for classification. We evaluate our system using K-fold cross-validation. The datasets used in this study are Extrasensory and Sussex-Huawei Locomotion (SHL), both of which contain physical and localization activities.
Our experiments demonstrate that the system achieves high accuracy: 96% (Extrasensory) and 94% (SHL) on physical activities, and 94% (Extrasensory) and 91% (SHL) on location-based activities, outperforming previous state-of-the-art methods on both types of activities.
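The preprocessing and segmentation stages described above (Butterworth filtering for inertial data, median filtering for GPS, Hamming-windowed segmentation) can be sketched as follows. This is a minimal illustration on synthetic signals; the filter order, cutoff, kernel size, and window length are assumptions for demonstration and are not taken from the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt, medfilt

rng = np.random.default_rng(0)
inertial = rng.normal(size=4000)   # synthetic stand-in for one accelerometer channel
gps = rng.normal(size=4000)        # synthetic stand-in for one GPS-derived channel

# 1) Butterworth low-pass filter for the inertial signal
#    (order N and normalized cutoff Wn are assumptions).
b, a = butter(N=3, Wn=0.1)
inertial_f = filtfilt(b, a, inertial)   # zero-phase filtering

# 2) Median filter for the GPS signal (kernel size is an assumption).
gps_f = medfilt(gps, kernel_size=5)

# 3) Hamming-windowed segmentation into fixed-length frames
#    (window length of 200 samples is an assumption).
WIN = 200

def segment(sig, win=WIN):
    # Trim to a whole number of windows, reshape, and taper each frame.
    frames = sig[: len(sig) // win * win].reshape(-1, win)
    return frames * np.hamming(win)

frames_inertial = segment(inertial_f)
frames_gps = segment(gps_f)
print(frames_inertial.shape)  # (20, 200)
```

Each tapered frame then serves as the unit from which per-window features are extracted in the later stages of the pipeline.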
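The later stages (variance-threshold feature selection, permutation-based augmentation, Yeo-Johnson transformation, MLP classification under K-fold cross-validation) can be sketched with scikit-learn. The feature matrix, labels, threshold, number of sub-segments, and network size below are all illustrative assumptions, not values from the paper.

```python
import numpy as np
from sklearn.feature_selection import VarianceThreshold
from sklearn.preprocessing import PowerTransformer
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 12))   # stand-in per-window feature vectors
X[:, 0] = 1.0                   # one constant (zero-variance) feature to be dropped
y = np.arange(60) % 3           # balanced fake activity labels

# 1) Variance-threshold selection removes near-constant features.
X_sel = VarianceThreshold(threshold=1e-3).fit_transform(X)

# 2) Permutation-based augmentation: shuffle sub-segments of each
#    feature vector to synthesize extra samples (4 parts is an assumption).
def permute(row, rng=rng):
    parts = np.array_split(row, 4)
    rng.shuffle(parts)
    return np.concatenate(parts)

X_aug = np.vstack([X_sel, np.apply_along_axis(permute, 1, X_sel)])
y_aug = np.concatenate([y, y])

# 3) Yeo-Johnson power transformation to normalize feature distributions.
X_t = PowerTransformer(method="yeo-johnson").fit_transform(X_aug)

# 4) Multi-layer perceptron evaluated with K-fold cross-validation.
clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
scores = cross_val_score(clf, X_t, y_aug, cv=5)
print(len(scores))  # 5
```

On real data, the augmentation step would typically be applied only to under-represented classes, and the power transform would be fit inside each cross-validation fold to avoid leakage.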

Citing Articles

A Hybrid Approach for Sports Activity Recognition Using Key Body Descriptors and Hybrid Deep Learning Classifier.
Tayyab M, Alateyah S, Alnusayri M, Alatiyyah M, AlHammadi D, Jalal A. Sensors (Basel). 2025; 25(2). PMID: 39860811; PMC: 11769191. DOI: 10.3390/s25020441.


Target detection and classification via EfficientDet and CNN over unmanned aerial vehicles.
Yusuf M, Hanzla M, Al Mudawi N, Sadiq T, Alabdullah B, Rahman H. Front Neurorobot. 2024; 18:1448538. PMID: 39280254; PMC: 11392906. DOI: 10.3389/fnbot.2024.1448538.
