Decoding of Emotional Information in Voice-sensitive Cortices

Overview
Journal Curr Biol
Publisher Cell Press
Specialty Biology
Date 2009 May 19
PMID 19446457
Citations 78
Abstract

The ability to correctly interpret emotional signals from others is crucial for successful social interaction. Previous neuroimaging studies showed that voice-sensitive auditory areas activate to a broad spectrum of vocally expressed emotions more than to neutral speech melody (prosody). However, this enhanced response occurs irrespective of the specific emotion category, making it impossible to distinguish different vocal emotions with conventional analyses. Here, we presented pseudowords spoken in five prosodic categories (anger, sadness, neutral, relief, joy) during event-related functional magnetic resonance imaging (fMRI), then employed multivariate pattern analysis to discriminate between these categories on the basis of the spatial response pattern within the auditory cortex. Our results demonstrate successful decoding of vocal emotions from fMRI responses in bilateral voice-sensitive areas, which could not be obtained by using averaged response amplitudes only. Pairwise comparisons showed that each category could be classified against all other alternatives, indicating for each emotion a specific spatial signature that generalized across speakers. These results demonstrate for the first time that emotional information is represented by distinct spatial patterns that can be decoded from brain activity in modality-specific cortical areas.
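The analysis described in the abstract (multivariate pattern analysis of trial-wise fMRI response patterns, with pairwise classification between emotion categories) can be illustrated with a minimal sketch. The code below is an assumption-based illustration using scikit-learn, not the authors' actual pipeline: the data shapes, the linear SVM classifier, the leave-one-run-out cross-validation scheme, and all variable names are hypothetical.

```python
# Minimal MVPA sketch: decode emotion categories from voxel-wise fMRI response
# patterns. NOT the authors' code; data, classifier, and CV scheme are assumed.
import numpy as np
from itertools import combinations
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)

# Hypothetical data: one response pattern (n_voxels) per trial, extracted from
# a voice-sensitive auditory region of interest.
n_trials, n_voxels, n_runs = 200, 500, 8
X = rng.standard_normal((n_trials, n_voxels))      # trial-wise voxel patterns
emotions = np.array(["anger", "sadness", "neutral", "relief", "joy"])
y = rng.choice(emotions, size=n_trials)            # emotion label per trial
runs = rng.integers(0, n_runs, size=n_trials)      # scanning run per trial

cv = LeaveOneGroupOut()  # train on all runs but one, test on the held-out run

# Five-way classification: does the spatial pattern predict the emotion category?
acc = cross_val_score(SVC(kernel="linear"), X, y, groups=runs, cv=cv).mean()
print(f"5-way decoding accuracy: {acc:.3f} (chance = 0.20)")

# Pairwise comparisons: classify each emotion against every other category.
for a, b in combinations(emotions, 2):
    mask = np.isin(y, [a, b])
    acc_pair = cross_val_score(SVC(kernel="linear"), X[mask], y[mask],
                               groups=runs[mask], cv=cv).mean()
    print(f"{a} vs {b}: accuracy = {acc_pair:.3f} (chance = 0.50)")
```

In this sketch, above-chance accuracy in the pairwise loop would correspond to the paper's finding that each emotion category carries a distinct spatial signature; with the random data used here, accuracies stay near chance.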

Citing Articles

The Mandarin Chinese auditory emotions stimulus database: A validated corpus of monosyllabic Chinese characters.

Li M, Li N, Zhou A, Yan H, Li Q, Ma C Behav Res Methods. 2025; 57(3):89.

PMID: 39900840 DOI: 10.3758/s13428-025-02607-4.


More than labels: neural representations of emotion words are widely distributed across the brain.

Lee K, Satpute A Soc Cogn Affect Neurosci. 2024; 19(1).

PMID: 38903026 PMC: 11259136. DOI: 10.1093/scan/nsae043.


Auditory dyadic interactions through the "eye" of the social brain: How visual is the posterior STS interaction region?

Landsiedel J, Koldewyn K Imaging Neurosci (Camb). 2023; 1:1-20.

PMID: 37719835 PMC: 10503480. DOI: 10.1162/imag_a_00003.


Automatic Brain Categorization of Discrete Auditory Emotion Expressions.

Talwar S, Barbero F, Calce R, Collignon O Brain Topogr. 2023; 36(6):854-869.

PMID: 37639111 PMC: 10522533. DOI: 10.1007/s10548-023-00983-8.


Emotional sounds in space: asymmetrical representation within early-stage auditory areas.

Grisendi T, Clarke S, Da Costa S Front Neurosci. 2023; 17:1164334.

PMID: 37274197 PMC: 10235458. DOI: 10.3389/fnins.2023.1164334.