
Inter-subject Synchronization of Brain Responses During Natural Music Listening

Overview
Journal: Eur J Neurosci
Specialty: Neurology
Date: 2013 Apr 13
PMID: 23578016
Citations: 48
Abstract

Music is a cultural universal and a rich part of the human experience. However, little is known about common brain systems that support the processing and integration of extended, naturalistic 'real-world' music stimuli. We examined this question by presenting extended excerpts of symphonic music and two pseudo-musical stimuli, in which the temporal and spectral structure of the Natural Music condition was disrupted, to non-musician participants undergoing functional brain imaging, and by analysing synchronized spatiotemporal activity patterns between listeners. We found that music synchronizes brain responses across listeners in bilateral auditory midbrain and thalamus, primary auditory and auditory association cortex, right-lateralized structures in frontal and parietal cortex, and motor planning regions of the brain. These effects were greater for natural music compared to the pseudo-musical control conditions. Remarkably, inter-subject synchronization in the inferior colliculus and medial geniculate nucleus was also greater for the natural music condition, indicating that synchronization at these early stages of auditory processing is not simply driven by spectro-temporal features of the stimulus. Increased synchronization during music listening was also evident in a right-hemisphere fronto-parietal attention network and bilateral cortical regions involved in motor planning. While these brain structures have previously been implicated in various aspects of musical processing, our results are the first to show that these regions track structural elements of a musical stimulus over extended time periods lasting minutes. Our results show that a hierarchical distributed network is synchronized between individuals during the processing of extended musical sequences, and they provide new insight into the temporal integration of complex and biologically salient auditory sequences.
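The core analysis named in the abstract, inter-subject synchronization, is commonly computed as a leave-one-out inter-subject correlation (ISC): each subject's time series is correlated with the average time series of all remaining subjects. As a rough illustration only (this is not the paper's actual pipeline, and the data here are synthetic), a minimal NumPy sketch might look like:

```python
import numpy as np

def intersubject_correlation(data):
    """Leave-one-out inter-subject correlation (ISC).

    data: array of shape (n_subjects, n_timepoints), one voxel's
    (or region's) time series per subject.
    Returns one Pearson r per subject: that subject's time series
    correlated with the mean time series of all other subjects.
    """
    data = np.asarray(data, dtype=float)
    n_subjects = data.shape[0]
    isc = np.empty(n_subjects)
    for s in range(n_subjects):
        # Average everyone except subject s, then correlate.
        others = np.delete(data, s, axis=0).mean(axis=0)
        isc[s] = np.corrcoef(data[s], others)[0, 1]
    return isc

# Toy demo: a shared "stimulus-driven" signal plus subject-specific noise
rng = np.random.default_rng(0)
shared = np.sin(np.linspace(0, 20, 300))
subjects = shared + 0.5 * rng.standard_normal((8, 300))
print(intersubject_correlation(subjects).mean())  # high mean r when responses are synchronized
```

In this framing, a stimulus that reliably drives brain responses (natural music) yields higher ISC than controls with disrupted structure, because only the stimulus-locked component survives averaging across the other subjects.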

Citing Articles

Consistent movement of viewers' facial keypoints while watching emotionally evocative videos.
Tripathi S, Garg R. PLoS One. 2024; 19(5):e0302705.
PMID: 38758739. PMC: 11101037. DOI: 10.1371/journal.pone.0302705.

A model of time-varying music engagement.
Omigie D, Mencke I. Philos Trans R Soc Lond B Biol Sci. 2023; 379(1895):20220421.
PMID: 38104598. PMC: 10725767. DOI: 10.1098/rstb.2022.0421.

Inter-subject correlations of EEG reflect subjective arousal and acoustic features of music.
Ueno F, Shimada S. Front Hum Neurosci. 2023; 17:1225377.
PMID: 37671247. PMC: 10475548. DOI: 10.3389/fnhum.2023.1225377.

Why art? The role of arts in arts and health.
Vickhoff B. Front Psychol. 2023; 14:765019.
PMID: 37034911. PMC: 10075207. DOI: 10.3389/fpsyg.2023.765019.

Modelling the perception of music in brain network dynamics.
Sawicki J, Hartmann L, Bader R, Schöll E. Front Netw Physiol. 2023; 2:910920.
PMID: 36926090. PMC: 10013054. DOI: 10.3389/fnetp.2022.910920.
