
Human Facial Neural Activities and Gesture Recognition for Machine-interfacing Applications

Overview
Publisher Dove Medical Press
Specialty Biotechnology
Date 2012 Jan 24
PMID 22267930
Citations 6
Abstract

The authors present a new method of recognizing different human facial gestures through their neural activities and muscle movements, which can be used in machine-interfacing applications. Human-machine interface (HMI) technology utilizes human neural activities as input controllers for the machine. Recently, much work has been done on the specific application of facial electromyography (EMG)-based HMI, which has used limited and fixed numbers of facial gestures. In this work, a multipurpose interface is proposed that supports between 2 and 11 control commands and can be applied to various HMI systems. The significance of this work lies in identifying the most accurately recognized facial gestures for any application requiring up to eleven control commands. EMGs of eleven facial gestures are recorded from ten volunteers. Detected EMGs are passed through a band-pass filter, and root-mean-square (RMS) features are extracted. Various combinations of gestures, with different numbers of gestures per group, are formed from the recorded facial gestures. Finally, all combinations are trained and classified by a Fuzzy c-means classifier, and the combination with the highest recognition accuracy in each group is selected. An average accuracy above 90% for the chosen combinations demonstrates their suitability as command controllers.
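The processing chain described in the abstract can be illustrated with a minimal sketch, which is not the authors' code: band-pass filtering of a facial EMG channel, RMS feature extraction over sliding windows, and Fuzzy c-means (FCM) clustering of the resulting feature vectors. The sampling rate, filter band edges, window length, and fuzziness exponent m below are illustrative assumptions, not values reported in the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000            # assumed EMG sampling rate (Hz)
LOW, HIGH = 20, 450  # assumed band-pass edges (Hz)

def bandpass(emg, fs=FS, low=LOW, high=HIGH, order=4):
    """Zero-phase Butterworth band-pass filter for one EMG channel."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, emg)

def rms_features(emg, win=200, step=100):
    """Root-mean-square value over sliding windows -> one feature per window."""
    starts = range(0, len(emg) - win + 1, step)
    return np.array([np.sqrt(np.mean(emg[i:i + win] ** 2)) for i in starts])

def fuzzy_cmeans(X, c, m=2.0, iters=100, tol=1e-5, seed=None):
    """Plain Fuzzy c-means on rows of X; returns (centroids, membership matrix U)."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], c))
    U /= U.sum(axis=1, keepdims=True)                 # memberships sum to 1 per sample
    for _ in range(iters):
        Um = U ** m
        centroids = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2) + 1e-12
        U_new = 1.0 / (d ** (2 / (m - 1)))
        U_new /= U_new.sum(axis=1, keepdims=True)     # standard FCM membership update
        if np.abs(U_new - U).max() < tol:
            return centroids, U_new
        U = U_new
    return centroids, U

if __name__ == "__main__":
    # Toy usage with synthetic data: two fake EMG channels, RMS feature vectors
    # per window, clustered into c groups (c would range from 2 to 11 in the paper).
    t = np.arange(0, 2.0, 1 / FS)
    raw = np.vstack([np.random.randn(len(t)) for _ in range(2)])
    filtered = np.vstack([bandpass(ch) for ch in raw])
    feats = np.vstack([rms_features(ch) for ch in filtered]).T    # windows x channels
    centroids, U = fuzzy_cmeans(feats, c=3, seed=0)
    labels = U.argmax(axis=1)                                     # hard assignment per window
    print(labels[:10])
```

In the paper the cluster count would correspond to the number of gestures in a candidate combination, and recognition accuracy would be computed against the known gesture labels; the snippet above only shows the unsupervised FCM step on toy data.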

Citing Articles

High-resolution surface electromyographic activities of facial muscles during mimic movements in healthy adults: A prospective observational study.

Mueller N, Trentzsch V, Grassme R, Guntinas-Lichius O, Volk G, Anders C. Front Hum Neurosci. 2022; 16:1029415.

PMID: 36579128 PMC: 9790991. DOI: 10.3389/fnhum.2022.1029415.


Face-Computer Interface (FCI): Intent Recognition Based on Facial Electromyography (fEMG) and Online Human-Computer Interface With Audiovisual Feedback.

Zhu B, Zhang D, Chu Y, Zhao X, Zhang L, Zhao L. Front Neurorobot. 2021; 15:692562.

PMID: 34335220 PMC: 8322851. DOI: 10.3389/fnbot.2021.692562.


Hands-Free Human-Computer Interface Based on Facial Myoelectric Pattern Recognition.

Lu Z, Zhou P. Front Neurol. 2019; 10:444.

PMID: 31114539 PMC: 6503102. DOI: 10.3389/fneur.2019.00444.


Predicting 3D lip shapes using facial surface EMG.

Eskes M, van Alphen M, Balm A, Smeele L, Brandsma D, van der Heijden F. PLoS One. 2017; 12(4):e0175025.

PMID: 28406945 PMC: 5390998. DOI: 10.1371/journal.pone.0175025.


Muscle sensor model using small scale optical device for pattern recognitions.

Tamee K, Chaiwong K, Yothapakdee K, Yupapin P. ScientificWorldJournal. 2013; 2013:346047.

PMID: 24222730 PMC: 3810185. DOI: 10.1155/2013/346047.


References
1.
Lopresti E, Brienza D, Angelo J, Gilbertson L. Neck range of motion and use of computer head controls. J Rehabil Res Dev. 2003; 40(3):199-211.

2.
Oskoei M, Hu H. Support vector machine-based classification scheme for myoelectric control applied to upper limb. IEEE Trans Biomed Eng. 2008; 55(8):1956-65. DOI: 10.1109/TBME.2008.919734.

3.
Barreto A, Scargle S, Adjouadi M. A practical EMG-based human-computer interface for users with motor disabilities. J Rehabil Res Dev. 2000; 37(1):53-63.

4.
Knaflitz M, Bonato P. Time-frequency methods applied to muscle fatigue assessment during dynamic contractions. J Electromyogr Kinesiol. 1999; 9(5):337-50. DOI: 10.1016/s1050-6411(99)00009-7.

5.
Wolpaw J, Loeb G, Allison B, Donchin E, Nascimento O, Heetderks W. BCI Meeting 2005--workshop on signals and recording methods. IEEE Trans Neural Syst Rehabil Eng. 2006; 14(2):138-41. DOI: 10.1109/TNSRE.2006.875583.