
Face-Computer Interface (FCI): Intent Recognition Based on Facial Electromyography (fEMG) and Online Human-Computer Interface With Audiovisual Feedback

Overview
Date 2021 Aug 2
PMID 34335220
Citations 5
Abstract

Patients who have lost the ability to control their limbs, such as those with upper limb amputation or high paraplegia, are usually unable to care for themselves. Establishing a natural, stable, and comfortable human-computer interface (HCI) for controlling rehabilitation robots and other controllable equipment would relieve many of their difficulties. In this study, a complete limbs-free face-computer interface (FCI) framework based on facial electromyography (fEMG), covering both offline analysis and online control of mechanical equipment, was proposed. Six facial movements involving the eyebrows, eyes, and mouth were used in this FCI. In the offline stage, 12 models, eight types of features, and three feature combination methods for model input were studied and compared in detail. In the online stage, four well-designed sessions were introduced in which a robotic arm completed a drinking task in three ways (by touch screen, and by fEMG with and without audio feedback) to verify the proposed FCI framework and compare its performance. Three features and one model, with an average offline recognition accuracy of 95.3%, a maximum of 98.8%, and a minimum of 91.4%, were selected for the online scenarios. The condition with audio feedback performed better than the one without. All subjects completed the drinking task within a few minutes using the FCI. The average and smallest time differences between touch-screen and fEMG control under audio feedback were only 1.24 and 0.37 min, respectively.
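The abstract does not name the eight feature types or the twelve models, so the sketch below is only illustrative. It assumes common fEMG time-domain features (mean absolute value, root mean square, zero crossings, waveform length), an LDA classifier, synthetic 8-channel recordings, and hypothetical window and sampling parameters, none of which are taken from the paper; it is meant only to show the general shape of an offline fEMG intent-recognition pipeline like the one evaluated in the offline stage.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

def window_features(emg, win=200, step=100):
    """Slide a window over multichannel fEMG (samples x channels) and compute
    four classic time-domain features per channel in each window."""
    feats = []
    for start in range(0, emg.shape[0] - win + 1, step):
        seg = emg[start:start + win]                              # one analysis window
        mav = np.mean(np.abs(seg), axis=0)                        # mean absolute value
        rms = np.sqrt(np.mean(seg ** 2, axis=0))                  # root mean square
        zc = np.sum(np.diff(np.sign(seg), axis=0) != 0, axis=0)   # zero crossings
        wl = np.sum(np.abs(np.diff(seg, axis=0)), axis=0)         # waveform length
        feats.append(np.concatenate([mav, rms, zc, wl]))
    return np.asarray(feats)

# Synthetic stand-in for recorded fEMG: 6 facial movements x 20 trials each,
# 1 s of 8-channel data per trial at an assumed 1 kHz sampling rate.
rng = np.random.default_rng(0)
X, y = [], []
for movement in range(6):
    for _ in range(20):
        trial = rng.normal(scale=1.0 + 0.2 * movement, size=(1000, 8))
        X.append(window_features(trial).mean(axis=0))             # trial-level feature vector
        y.append(movement)
X, y = np.asarray(X), np.asarray(y)

# Cross-validated accuracy of the assumed classifier on the assumed features;
# in an online setting the per-window prediction would be mapped to a robot command.
scores = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5)
print(f"mean offline accuracy: {scores.mean():.3f}")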

Citing Articles

Using principles of motor control to analyze performance of human machine interfaces.

Patwardhan S, Gladhill K, Joiner W, Schofield J, Lee B, Sikdar S. Sci Rep. 2023; 13(1):13273.

PMID: 37582852 PMC: 10427694. DOI: 10.1038/s41598-023-40446-5.


Facial electromyogram-based facial gesture recognition for hands-free control of an AR/VR environment: optimal gesture set selection and validation of feasibility as an assistive technology.

Kim C, Kim C, Kim H, Kwak H, Lee W, Im C. Biomed Eng Lett. 2023; 13(3):465-473.

PMID: 37519877 PMC: 10382369. DOI: 10.1007/s13534-023-00277-9.


Using Principles of Motor Control to Analyze Performance of Human Machine Interfaces.

Patwardhan S, Gladhill K, Joiner W, Schofield J, Sikdar S. Res Sq. 2023.

PMID: 37292730 PMC: 10246101. DOI: 10.21203/rs.3.rs-2763325/v1.


Crosstalk in Facial EMG and Its Reduction Using ICA.

Sato W, Kochiyama T. Sensors (Basel). 2023; 23(5).

PMID: 36904924 PMC: 10007323. DOI: 10.3390/s23052720.


Human-machine interface-based wheelchair control using piezoelectric sensors based on face and tongue movements.

Bouyam C, Punsawad Y. Heliyon. 2022; 8(11):e11679.

PMID: 36425438 PMC: 9679277. DOI: 10.1016/j.heliyon.2022.e11679.
