Seeing Emotion with Your Ears: Emotional Prosody Implicitly Guides Visual Attention to Faces

Overview
Journal PLoS One
Date 2012 Feb 4
PMID 22303454
Citations 16
Abstract

Interpersonal communication involves the processing of multimodal emotional cues, particularly facial expressions (visual modality) and emotional speech prosody (auditory modality), which can interact during information processing. Here, we investigated whether the implicit processing of emotional prosody systematically influences gaze behavior to facial expressions of emotion. We analyzed the eye movements of 31 participants as they scanned a visual array of four emotional faces portraying fear, anger, happiness, and neutrality, while listening to an emotionally inflected pseudo-utterance ("Someone migged the pazing") spoken in a congruent or incongruent tone. Participants heard the emotional utterance during the first 1250 milliseconds of a five-second visual array and then performed an immediate recall decision about the face they had just seen. The frequency and duration of first saccades and of total looks in three temporal windows ([0-1250 ms], [1250-2500 ms], [2500-5000 ms]) were analyzed according to the emotional content of faces and voices. Results showed that participants looked longer and more frequently at faces that matched the prosody in all three time windows (emotion congruency effect), although this effect was often emotion-specific (with greatest effects for fear). Effects of prosody on visual attention to faces persisted over time and could be detected long after the auditory information was no longer present. These data imply that emotional prosody is processed automatically during communication and that these cues play a critical role in how humans respond to related visual cues in the environment, such as facial expressions.

Citing Articles

Setting the tone: crossmodal emotional face-voice combinations in continuous flash suppression.

Muller U, Gerdes A, Alpers G Front Psychol. 2025; 15:1472489.

PMID: 39886372 PMC: 11780550. DOI: 10.3389/fpsyg.2024.1472489.


Deployment of attention to facial expressions varies as a function of emotional quality-but not in alexithymic individuals.

Surber C, Hoepfel D, Gunther V, Kersting A, Rufer M, Suslow T Front Psychiatry. 2024; 15:1338194.

PMID: 38510803 PMC: 10950908. DOI: 10.3389/fpsyt.2024.1338194.


Effects of emotion words activation and satiation on facial expression perception: evidence from behavioral and ERP investigations.

Xu Q, Wang W, Yang Y, Li W Front Psychiatry. 2023; 14:1192450.

PMID: 37588024 PMC: 10425554. DOI: 10.3389/fpsyt.2023.1192450.


Pupil dilation reflects the dynamic integration of audiovisual emotional speech.

Arias Sarah P, Hall L, Saitovitch A, Aucouturier J, Zilbovicius M, Johansson P Sci Rep. 2023; 13(1):5507.

PMID: 37016041 PMC: 10073148. DOI: 10.1038/s41598-023-32133-2.


Covert Attention to Gestures Is Sufficient for Information Uptake.

Kandana Arachchige K, Blekic W, Simoes Loureiro I, Lefebvre L Front Psychol. 2021; 12:776867.

PMID: 34917002 PMC: 8669744. DOI: 10.3389/fpsyg.2021.776867.

