
Improving Robotic Hand Prosthesis Control With Eye Tracking and Computer Vision: A Multimodal Approach Based on the Visuomotor Behavior of Grasping

Overview
Date 2022 Feb 11
PMID 35146422
Abstract

The complexity and dexterity of the human hand make the development of natural and robust control of hand prostheses challenging. Although a large number of control approaches have been developed and investigated in recent decades, limited robustness in real-life conditions has often prevented their adoption in clinical settings and commercial products. In this paper, we investigate a multimodal approach that exploits eye-hand coordination to improve the control of myoelectric hand prostheses. The analyzed data come from the publicly available MeganePro Dataset 1, which includes multimodal data from transradial amputees and able-bodied subjects grasping numerous household objects with ten grasp types. A continuous grasp-type classifier based on surface electromyography served as both intent detector and classifier. In parallel, eye-hand coordination parameters, gaze data, and object recognition in first-person videos were used to identify the object a person aims to grasp. The results show that including visual information significantly increases the average offline classification accuracy, by up to 15.61 ± 4.22% for the transradial amputees and by up to 7.37 ± 3.52% for the able-bodied subjects, allowing the amputees to reach an average classification accuracy comparable to that of intact subjects. These results suggest that the robustness of hand prosthesis control based on grasp-type recognition can be significantly improved by visual information extracted by leveraging natural eye-hand coordination behavior, without placing additional cognitive burden on the user.
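The abstract describes fusing an sEMG grasp-type classifier with the object identified at the gaze point, but does not state the fusion rule itself. The following is a minimal illustrative sketch in Python, assuming a simple Bayesian-style late fusion; the grasp set, the object-to-grasp prior values, and the function fuse_emg_and_vision are hypothetical, not taken from the paper.

```python
import numpy as np

GRASP_TYPES = ["power", "precision", "lateral", "tripod"]  # illustrative subset

# Assumed object-to-grasp compatibility prior, P(grasp | object).
# Values are made up for illustration only.
OBJECT_GRASP_PRIOR = {
    "mug":   np.array([0.50, 0.10, 0.30, 0.10]),
    "pen":   np.array([0.05, 0.45, 0.15, 0.35]),
    "plate": np.array([0.15, 0.20, 0.55, 0.10]),
}

def fuse_emg_and_vision(emg_posterior: np.ndarray, gazed_object: str | None) -> int:
    """Combine the sEMG grasp-type posterior with a visual prior.

    If no object is identified at the gaze point, fall back to the
    EMG-only decision, mirroring the paper's use of sEMG as the
    primary intent detector and classifier.
    """
    if gazed_object is None or gazed_object not in OBJECT_GRASP_PRIOR:
        return int(np.argmax(emg_posterior))
    prior = OBJECT_GRASP_PRIOR[gazed_object]
    fused = emg_posterior * prior   # elementwise Bayesian-style reweighting
    fused /= fused.sum()            # renormalize to a distribution
    return int(np.argmax(fused))

# Example: ambiguous EMG evidence is disambiguated by the gazed object.
emg_posterior = np.array([0.35, 0.30, 0.25, 0.10])
print(GRASP_TYPES[fuse_emg_and_vision(emg_posterior, "pen")])  # -> "precision"
```

Multiplying the two distributions and renormalizing lets strong visual evidence override ambiguous sEMG while leaving the EMG-only pathway intact when no object is detected, in the spirit of the approach the abstract describes.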

Citing Articles

Multimodal fusion of EMG and vision for human grasp intent inference in prosthetic hand control.

Zandigohar M, Han M, Sharif M, Gunay S, Furmanek M, Yarossi M. Front Robot AI. 2024; 11:1312554.

PMID: 38476118. PMC: 10927746. DOI: 10.3389/frobt.2024.1312554.


Hand Grasp Pose Prediction Based on Motion Prior Field.

Shi X, Guo W, Xu W, Sheng X. Biomimetics (Basel). 2023; 8(2).

PMID: 37366845. PMC: 10296099. DOI: 10.3390/biomimetics8020250.
