
Improving Human-Robot Interaction by Enhancing NAO Robot Awareness of Human Facial Expression

Overview
Journal Sensors (Basel)
Publisher MDPI
Specialty Biotechnology
Date 2021 Oct 13
PMID 34640758
Citations 7
Abstract

An intriguing challenge in the human-robot interaction field is the prospect of endowing robots with emotional intelligence to make the interaction more genuine, intuitive, and natural. A crucial aspect in achieving this goal is the robot's capability to infer and interpret human emotions. Thanks to its design and open programming platform, the NAO humanoid robot is one of the most widely used agents for human interaction. As in person-to-person communication, facial expressions are the primary channel for recognizing the interlocutor's emotional state. Although NAO is equipped with a facial expression recognition module, specific use cases may require additional features and affective computing capabilities that are not currently available. This study proposes a highly accurate convolutional-neural-network-based facial expression recognition model that further enhances the NAO robot's awareness of human facial expressions and provides the robot with the capability to detect the interlocutor's arousal level. When tested during human-robot interactions, the model was 91% and 90% accurate in recognizing happy and sad facial expressions, respectively; 75% accurate in recognizing surprised and scared expressions; and less accurate in recognizing neutral and angry expressions. Finally, the model was successfully integrated into the NAO SDK, allowing for high-performing facial expression classification with an inference time of 0.34 ± 0.04 s.
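The abstract describes a pipeline in which a CNN scores a face image across the six expression classes and the winning class is then mapped to a coarse arousal level. The paper's actual model and its arousal mapping are not reproduced here; the following is a minimal illustrative sketch of that final classification step, assuming raw CNN logits as input. The grouping of expressions into high versus low arousal below is an assumption for illustration, not taken from the paper.

```python
import math

# The six expression classes reported in the abstract.
LABELS = ["happy", "sad", "surprised", "scared", "neutral", "angry"]

# Hypothetical mapping of expressions to a coarse arousal level.
# This grouping is an illustrative assumption, not the paper's mapping.
HIGH_AROUSAL = {"happy", "surprised", "scared", "angry"}

def softmax(logits):
    """Convert raw CNN output scores into class probabilities."""
    m = max(logits)                              # subtract max for stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify_expression(logits):
    """Return (label, confidence, arousal) for one face crop's logits."""
    probs = softmax(logits)
    i = max(range(len(probs)), key=probs.__getitem__)
    label = LABELS[i]
    arousal = "high" if label in HIGH_AROUSAL else "low"
    return label, probs[i], arousal

# Example: logits strongly favouring the "happy" class.
label, confidence, arousal = classify_expression([4.1, 0.3, 1.2, 0.5, 0.9, 0.2])
print(label, arousal)
```

In a deployment like the one the abstract describes, the logits would come from the trained CNN running on frames grabbed via the NAO SDK's camera interface, and the returned label and arousal level would drive the robot's response.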

Citing Articles

An Emotion Recognition Method for Humanoid Robot Body Movements Based on a PSO-BP-RMSProp Neural Network.

Gao W, Jiang T, Zhai W, Zha F Sensors (Basel). 2024; 24(22).

PMID: 39599003 PMC: 11598485. DOI: 10.3390/s24227227.


e-VITA study protocol: EU-Japan virtual coach for smart aging.

Bevilacqua R, Stara V, Amabili G, Margaritini A, Benadduci M, Barbarossa F Front Public Health. 2024; 12:1256734.

PMID: 38544729 PMC: 10968892. DOI: 10.3389/fpubh.2024.1256734.


Empowering Smart Aging: Insights into the Technical Architecture of the e-VITA Virtual Coaching System for Older Adults.

Naccarelli R, D'Agresti F, Roelen S, Jokinen K, Casaccia S, Revel G Sensors (Basel). 2024; 24(2).

PMID: 38276330 PMC: 10818560. DOI: 10.3390/s24020638.


Assessing Feasibility of Cognitive Impairment Testing Using Social Robotic Technology Augmented with Affective Computing and Emotional State Detection Systems.

Russo S, Lorusso L, D'Onofrio G, Ciccone F, Tritto M, Nocco S Biomimetics (Basel). 2023; 8(6).

PMID: 37887606 PMC: 10604561. DOI: 10.3390/biomimetics8060475.


New Trends in Emotion Recognition Using Image Analysis by Neural Networks, A Systematic Review.

Cirneanu A, Popescu D, Iordache D Sensors (Basel). 2023; 23(16).

PMID: 37631629 PMC: 10458371. DOI: 10.3390/s23167092.

