
A Depth Video Sensor-based Life-logging Human Activity Recognition System for Elderly Care in Smart Indoor Environments

Overview
Journal: Sensors (Basel)
Publisher: MDPI
Specialty: Biotechnology
Date: 2014 Jul 4
PMID: 24991942
Citations: 28
Abstract

Recent advancements in depth video sensor technologies have made human activity recognition (HAR) feasible for elderly monitoring applications. Although conventional HAR relies on RGB video sensors, it can be greatly improved with depth video sensors, which provide depth or distance information. In this paper, a depth-based life-logging HAR system is designed to recognize the daily activities of elderly people and turn their living spaces into intelligent indoor environments. Initially, a depth imaging sensor is used to capture depth silhouettes. From these silhouettes, human skeletons with joint information are produced, which are then used for activity recognition and for generating life logs. The life-logging system is divided into two processes. First, the training process includes data collection using a depth camera, feature extraction, and training of a Hidden Markov Model for each activity. Second, after training, the recognition engine recognizes the learned activities and produces life logs. The system was evaluated using the life-logging features against principal component and independent component features, and it achieved satisfactory recognition rates compared with these conventional approaches. Experiments conducted on the smart indoor activity datasets and the MSRDailyActivity3D dataset show promising results. The proposed system is directly applicable to any elderly monitoring application, such as tracking healthcare problems of elderly people or examining the indoor activities of people at home, in the office, or in a hospital.
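The recognition pipeline summarized above (per-frame features derived from skeleton joints, one Hidden Markov Model trained per activity, and recognition by selecting the most likely model) can be sketched roughly as follows. This is a minimal illustration, assuming the third-party hmmlearn library and flattened joint coordinates as stand-in frame features; it is not the authors' implementation, whose silhouette-based feature extraction is more elaborate.

    # Minimal sketch: one Gaussian HMM per activity, classification by
    # maximum log-likelihood. Frame features are assumed to be flattened
    # skeleton joint coordinates, a simplification of the paper's descriptors.
    import numpy as np
    from hmmlearn import hmm  # assumed available: pip install hmmlearn


    def train_activity_models(sequences_by_activity, n_states=4):
        """Fit one Gaussian HMM per activity label.

        sequences_by_activity: dict mapping activity name -> list of
            (n_frames, n_features) arrays, one array per recorded clip.
        """
        models = {}
        for activity, clips in sequences_by_activity.items():
            X = np.concatenate(clips)          # stack all frames of this activity
            lengths = [len(c) for c in clips]  # clip boundaries for hmmlearn
            model = hmm.GaussianHMM(n_components=n_states,
                                    covariance_type="diag", n_iter=50)
            model.fit(X, lengths)
            models[activity] = model
        return models


    def recognize(models, clip):
        """Return the activity whose HMM assigns the highest log-likelihood."""
        return max(models, key=lambda activity: models[activity].score(clip))

In this sketch, recognition of an unseen clip reduces to scoring it under every trained model and reporting the best-scoring activity, which is the standard maximum-likelihood decision rule for HMM-based classifiers.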

Citing Articles

Indoor Infrared Sensor Layout Optimization for Elderly Monitoring Based on Fused Genetic Gray Wolf Optimization (FGGWO) Algorithm.

Chen S, Chen Y, Feng M. Sensors (Basel). 2024; 24(16).

PMID: 39205086 PMC: 11359595. DOI: 10.3390/s24165393.


Predictive Data Analytics in Telecare and Telehealth: Systematic Scoping Review.

Anderson E, Lennon M, Kavanagh K, Weir N, Kernaghan D, Roper M. Online J Public Health Inform. 2024; 16:e57618.

PMID: 39110501 PMC: 11339581. DOI: 10.2196/57618.


A CNN Model for Physical Activity Recognition and Energy Expenditure Estimation from an Eyeglass-Mounted Wearable Sensor.

Hossain M, LaMunion S, Crouter S, Melanson E, Sazonov E. Sensors (Basel). 2024; 24(10).

PMID: 38793899 PMC: 11125058. DOI: 10.3390/s24103046.


A Multi-Modal Egocentric Activity Recognition Approach towards Video Domain Generalization.

Papadakis A, Spyrou E. Sensors (Basel). 2024; 24(8).

PMID: 38676108 PMC: 11054491. DOI: 10.3390/s24082491.


TCN-attention-HAR: human activity recognition based on attention mechanism time convolutional network.

Wei X, Wang Z. Sci Rep. 2024; 14(1):7414.

PMID: 38548859 PMC: 10978978. DOI: 10.1038/s41598-024-57912-3.

