
Sensory Processing During Viewing of Cinematographic Material: Computational Modeling and Functional Neuroimaging

Overview
Journal Neuroimage
Specialty Radiology
Date 2012 Dec 4
PMID 23202431
Citations 13
Abstract

The investigation of brain activity using naturalistic, ecologically valid stimuli is becoming an important challenge for neuroscience research. Several approaches have been proposed, primarily relying on data-driven methods (e.g., independent component analysis, ICA). However, data-driven methods often require some post-hoc interpretation of the imaging results to draw inferences about the underlying sensory, motor, or cognitive functions. Here, we propose using a biologically plausible computational model to extract (multi-)sensory stimulus statistics that can be used for standard hypothesis-driven analyses (general linear model, GLM). We ran two separate fMRI experiments, both of which involved subjects watching an episode of a TV series. In Exp 1, we manipulated the presentation by switching color, motion, and/or sound on and off at variable intervals, whereas in Exp 2 the video was played in its original version, with all the consequent continuous changes of the different sensory features intact. For both vision and audition, we extracted stimulus statistics corresponding to spatial and temporal discontinuities of low-level features, as well as a combined measure related to overall stimulus saliency. Results showed that activity in the occipital visual cortex and the superior temporal auditory cortex co-varied with changes of the low-level features. Visual saliency further boosted activity in the extra-striate visual cortex and posterior parietal cortex, while auditory saliency enhanced activity in the superior temporal cortex. Data-driven ICA analyses of the same datasets also identified "sensory" networks comprising visual and auditory areas, but without providing specific information about the possible underlying processes; e.g., these processes could relate to modality, stimulus features, and/or saliency.
We conclude that the combination of computational modeling and GLM enables the tracking of the impact of bottom-up signals on brain activity during viewing of complex and dynamic multisensory stimuli, beyond the capability of purely data-driven approaches.
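The hypothesis-driven approach described in the abstract amounts to convolving model-derived feature and saliency time series with a hemodynamic response function (HRF) and regressing each voxel's time series onto the resulting design matrix. The sketch below is a minimal illustration of that idea, not the authors' actual pipeline: the HRF shape, regressor names, TR, and data sizes are all assumptions chosen for demonstration.

```python
import numpy as np

def canonical_hrf(tr=1.0, duration=30.0):
    """Toy double-gamma HRF sampled at the scan TR (illustrative shape only)."""
    t = np.arange(0.0, duration, tr)
    peak = t ** 5 * np.exp(-t)        # positive response lobe
    under = t ** 15 * np.exp(-t)      # delayed undershoot
    hrf = peak / peak.max() - 0.35 * under / under.max()
    return hrf / hrf.sum()

def design_matrix(feature_ts, tr=1.0):
    """Convolve each model-derived feature time series with the HRF;
    prepend an intercept column."""
    hrf = canonical_hrf(tr)
    n = len(next(iter(feature_ts.values())))
    cols = [np.convolve(ts, hrf)[:n] for ts in feature_ts.values()]
    return np.column_stack([np.ones(n)] + cols)

def fit_glm(X, Y):
    """Ordinary least-squares fit of Y ≈ X @ B, one column of Y per voxel."""
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return B

# Toy usage with two hypothetical regressors (low-level change, saliency).
rng = np.random.default_rng(0)
n_scans, n_voxels = 200, 50
features = {
    "visual_change": rng.random(n_scans),
    "visual_saliency": rng.random(n_scans),
}
X = design_matrix(features, tr=2.0)           # shape (n_scans, 3)
true_B = rng.standard_normal((3, n_voxels))   # ground-truth betas
Y = X @ true_B                                # noiseless synthetic voxel data
B = fit_glm(X, Y)                             # least-squares recovers true_B
```

In a real analysis, the feature time series would come from the computational model applied to the video and soundtrack, and the estimated betas would be tested voxel-wise for significance; here the noiseless synthetic data merely show the regression mechanics.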

Citing Articles

Differential correlates of fear and anxiety in salience perception: A behavioral and ERP study with adolescents.

Oliveira M, Fernandes C, Barbosa F, Ferreira-Santos F. Cogn Affect Behav Neurosci. 2024; 24(1):143-155.

PMID: 38267798 PMC: 10827851. DOI: 10.3758/s13415-024-01159-y.


The auditory stimulus reduced the visual inhibition of return: Evidence from psychophysiological interaction analysis.

He Y, Peng X, Sun J, Tang X, Wang A, Zhang M. Hum Brain Mapp. 2023; 44(10):4152-4164.

PMID: 37195056 PMC: 10258538. DOI: 10.1002/hbm.26336.


Eye movement behavior in a real-world virtual reality task reveals ADHD in children.

Merzon L, Pettersson K, Aronen E, Huhdanpaa H, Seesjarvi E, Henriksson L. Sci Rep. 2022; 12(1):20308.

PMID: 36434040 PMC: 9700686. DOI: 10.1038/s41598-022-24552-4.


Time-varying measures of cerebral network centrality correlate with visual saliency during movie watching.

Ogawa A. Brain Behav. 2021; 11(9):e2334.

PMID: 34435748 PMC: 8442596. DOI: 10.1002/brb3.2334.


Decoding Auditory Saliency from Brain Activity Patterns during Free Listening to Naturalistic Audio Excerpts.

Zhao S, Han J, Jiang X, Huang H, Liu H, Lv J. Neuroinformatics. 2018; 16(3-4):309-324.

PMID: 29488069 DOI: 10.1007/s12021-018-9358-0.

