
IMU-Based Gait Recognition Using Convolutional Neural Networks and Multi-Sensor Fusion

Overview
Journal Sensors (Basel)
Publisher MDPI
Specialty Biotechnology
Date 2017 Dec 1
PMID 29186887
Citations 40
Abstract

The widespread use of wearable sensors such as smartwatches has provided continuous access to valuable user-generated data, such as human motion, that can be used to identify an individual from his/her movement patterns, such as gait. Several methods have been proposed to extract heuristic and high-level features from gait motion data in order to identify discriminative gait signatures and distinguish the target individual from others. However, manual, hand-crafted feature extraction is error-prone and subjective. Furthermore, motion data collected from inertial sensors have a complex structure, and the detachment between the manual feature-extraction module and the predictive learning model may limit generalization. In this paper, we propose a novel approach to human gait identification that uses a time-frequency (TF) expansion of human gait cycles to capture joint two-dimensional (2D) spectral and temporal patterns of gait cycles. We then design a deep convolutional neural network (DCNN) that extracts discriminative features from the 2D expanded gait cycles and jointly optimizes the identification model and the spectro-temporal features in a discriminative fashion. We collect raw motion data synchronously from five inertial sensors placed at the chest, lower back, right wrist, right knee, and right ankle of each subject in order to investigate the impact of sensor location on gait identification performance. We then present two methods for early (input-level) and late (decision-score-level) multi-sensor fusion to improve the generalization performance of gait identification. In particular, we propose the minimum error score fusion (MESF) method, which discriminatively learns linear fusion weights for the individual DCNN scores at the decision level by iteratively minimizing the error rate on the training data. Ten subjects participated in this study; hence, the problem is a 10-class identification task.
In our experiments, 91% subject identification accuracy was achieved using the best individual IMU with 2DTF-DCNN. Our proposed early and late sensor fusion approaches improved the identification accuracy of the system to 93.36% and 97.06%, respectively.
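The time-frequency expansion described in the abstract can be sketched with a short-time Fourier transform: a one-dimensional IMU segment becomes a 2D image (frequency bins by time frames) suitable as DCNN input. This is a minimal NumPy-only sketch, not the authors' exact pipeline; the window length (`nperseg`), hop size (`step`), and sampling assumptions are illustrative choices, not values from the paper.

```python
import numpy as np

def expand_gait_cycle(accel, nperseg=32, step=8):
    """Expand a 1-D accelerometer segment into a 2-D time-frequency image
    via a hand-rolled short-time Fourier transform (illustrative only)."""
    window = np.hanning(nperseg)
    frames = []
    for start in range(0, len(accel) - nperseg + 1, step):
        seg = accel[start:start + nperseg] * window
        # One-sided power spectrum of this frame
        frames.append(np.abs(np.fft.rfft(seg)) ** 2)
    # Stack frames as columns: rows = frequency bins, cols = time frames
    return np.log1p(np.array(frames).T)

# Example: a synthetic ~1 s gait cycle sampled at 100 Hz
cycle = np.sin(2 * np.pi * 2.0 * np.arange(100) / 100.0)
tf_image = expand_gait_cycle(cycle)  # shape: (17, 9)
```

The log compression at the end is a common conditioning step so the network sees a bounded dynamic range rather than raw spectral power.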
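The late-fusion idea behind MESF, learning linear weights over per-sensor DCNN scores by minimizing training error, can be illustrated with a simple coordinate-wise search. This is a hedged sketch under assumed interfaces (each sensor contributes an `n_samples x n_classes` score matrix); the paper's actual iterative weight-update rule may differ.

```python
import numpy as np

def fuse_scores(score_list, weights):
    """Linearly combine per-sensor score matrices (n_samples x n_classes)."""
    fused = np.zeros_like(score_list[0])
    for scores, w in zip(score_list, weights):
        fused += w * scores
    return fused

def error_rate(score_list, weights, labels):
    """Fraction of samples whose fused argmax prediction is wrong."""
    preds = fuse_scores(score_list, weights).argmax(axis=1)
    return np.mean(preds != labels)

def learn_fusion_weights(score_list, labels, n_iters=10,
                         grid=np.linspace(0.0, 1.0, 21)):
    """Iteratively retune one sensor weight at a time, keeping any change
    that lowers the training error rate (illustrative MESF-style search)."""
    n_sensors = len(score_list)
    weights = np.ones(n_sensors) / n_sensors
    for _ in range(n_iters):
        for i in range(n_sensors):
            best_w = weights[i]
            best_err = error_rate(score_list, weights, labels)
            for w in grid:
                trial = weights.copy()
                trial[i] = w
                err = error_rate(score_list, trial, labels)
                if err < best_err:
                    best_w, best_err = w, err
            weights[i] = best_w
    return weights

# Toy usage: one reliable sensor, one noisy sensor, 3 classes
rng = np.random.default_rng(0)
labels = np.array([0, 1, 2, 0, 1, 2])
good = np.eye(3)[labels] * 5.0          # near-perfect scores
bad = rng.normal(size=(6, 3))           # uninformative scores
w = learn_fusion_weights([good, bad], labels)
```

Because the search directly targets the training error rate rather than a surrogate loss, it naturally downweights sensors whose scores hurt the fused decision.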

Citing Articles

A method for blood pressure hydrostatic pressure correction using wearable inertial sensors and deep learning.

Colburn D, Chern T, Guo V, Salamat K, Pugliese D, Bradley C NPJ Biosens. 2025; 2(1):5.

PMID: 39897702 PMC: 11785522. DOI: 10.1038/s44328-024-00021-y.


Action Recognition in Basketball with Inertial Measurement Unit-Supported Vest.

Sonalcan H, Bilen E, Ates B, Seckin A Sensors (Basel). 2025; 25(2).

PMID: 39860933 PMC: 11769260. DOI: 10.3390/s25020563.


Leveraging Sensor Technology to Characterize the Postural Control Spectrum.

Aliperti C, Steckenrider J, Sattari D, Peterson J, Bell C, Zifchock R Sensors (Basel). 2024; 24(23).

PMID: 39685957 PMC: 11644092. DOI: 10.3390/s24237420.


Older Adult Fall Risk Prediction with Deep Learning and Timed Up and Go (TUG) Test Data.

Maiora J, Rezola-Pardo C, Garcia G, Sanz B, Grana M Bioengineering (Basel). 2024; 11(10).

PMID: 39451376 PMC: 11504430. DOI: 10.3390/bioengineering11101000.


Spatial and temporal attention embedded spatial temporal graph convolutional networks for skeleton based gait recognition with multiple IMUs.

Yan J, Xiong W, Jin L, Jiang J, Yang Z, Hu S iScience. 2024; 27(9):110646.

PMID: 39280595 PMC: 11402213. DOI: 10.1016/j.isci.2024.110646.


References
1.
Sposaro F, Tyson G. iFall: an Android application for fall monitoring and response. Annu Int Conf IEEE Eng Med Biol Soc. 2009; 2009:6119-22. DOI: 10.1109/IEMBS.2009.5334912. View

2.
Yurtman A, Barshan B. Activity Recognition Invariant to Sensor Orientation with Wearable Motion Sensors. Sensors (Basel). 2017; 17(8). PMC: 5579846. DOI: 10.3390/s17081838. View

3.
Kale A, Sundaresan A, Rajagopalan A, Cuntoor N, Roy-Chowdhury A, Kruger V. Identification of humans using gait. IEEE Trans Image Process. 2004; 13(9):1163-73. DOI: 10.1109/tip.2004.832865. View

4.
Murray M, Drought A, Kory R. Walking patterns of normal men. J Bone Joint Surg Am. 1964; 46:335-60. View

5.
Nutt J, Marsden C, Thompson P. Human walking and higher-level gait disorders, particularly in the elderly. Neurology. 1993; 43(2):268-79. DOI: 10.1212/wnl.43.2.268. View