
Learning Multisensory Cue Integration: A Computational Model of Crossmodal Synaptic Plasticity Enables Reliability-Based Cue Weighting by Capturing Stimulus Statistics

Overview
Date 2022 Aug 25
PMID 36004009
Abstract

The brain forms unified, coherent, and accurate percepts of events occurring in the environment by integrating information from multiple senses through the process of multisensory integration. The neural mechanisms underlying this process, its development, and its maturation in a multisensory environment are yet to be properly understood. Numerous psychophysical studies suggest that the multisensory cue integration process follows the principle of Bayesian estimation, where the contributions of individual sensory modalities are proportional to the relative reliabilities of the different sensory stimuli. In this article I hypothesize that experience-dependent crossmodal synaptic plasticity may be a plausible mechanism underlying the development of multisensory cue integration. I test this hypothesis with a computational model that implements Bayesian multisensory cue integration using reliability-based cue weighting. The model uses crossmodal synaptic plasticity to capture stimulus statistics within synaptic weights that are adapted to reflect the relative reliabilities of the participating stimuli. The model is embodied in a simulated robotic agent that learns to localize an audio-visual target by integrating spatial location cues extracted from the auditory and visual sensory modalities. Results of multiple randomized target localization trials in simulation indicate that the model is able to learn modality-specific synaptic weights proportional to the relative reliabilities of the auditory and visual stimuli. The proposed model with learned synaptic weights is also compared, via regression analysis, with a maximum-likelihood estimation model for cue integration. The results indicate that the proposed model reflects maximum-likelihood estimation.
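To make the reliability-based weighting concrete, the sketch below illustrates the underlying principle in Python: each cue's reliability is its inverse variance, estimated here directly from repeated noisy observations (a stand-in for the stimulus statistics that the model captures in its synaptic weights), and the integrated estimate is the reliability-weighted sum of the cues. This is a minimal, generic illustration of maximum-likelihood cue combination, not the paper's crossmodal-plasticity implementation; the noise levels and variable names are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# True target position (arbitrary units) and per-modality noise levels.
# The visual cue is assumed here to be more reliable (lower sigma) than
# the auditory cue; these values are illustrative, not from the paper.
target = 10.0
sigma_v, sigma_a = 1.0, 3.0

# Simulated repeated localization trials: each trial yields one noisy
# spatial estimate per modality.
n_trials = 10_000
visual = rng.normal(target, sigma_v, n_trials)
auditory = rng.normal(target, sigma_a, n_trials)

# "Capture stimulus statistics": estimate each cue's reliability
# (inverse variance) from the observed samples.
rel_v = 1.0 / visual.var()
rel_a = 1.0 / auditory.var()

# Reliability-based cue weights (normalized inverse variances).
w_v = rel_v / (rel_v + rel_a)
w_a = rel_a / (rel_v + rel_a)

# Integrated estimate per trial: reliability-weighted sum of the cues.
integrated = w_v * visual + w_a * auditory

# Under maximum-likelihood estimation with independent Gaussian cues the
# optimal weights are the normalized true inverse variances.
w_v_mle = (1 / sigma_v**2) / (1 / sigma_v**2 + 1 / sigma_a**2)
print(f"learned w_v = {w_v:.3f}, MLE w_v = {w_v_mle:.3f}")
print(f"var(visual) = {visual.var():.3f}, "
      f"var(auditory) = {auditory.var():.3f}, "
      f"var(integrated) = {integrated.var():.3f}")
```

Under this scheme the estimated weights converge to the normalized true inverse variances, and the variance of the integrated estimate falls below that of either unimodal cue, which is the maximum-likelihood signature the paper's regression analysis tests for.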

Citing Articles

Cue modality modulates interaction between exogenous spatial attention and audiovisual integration.

Wang A, Zhang H, Lu M, Wang J, Tang X, Zhang M. Exp Brain Res. 2024; 243(1):26.

PMID: 39699643. DOI: 10.1007/s00221-024-06970-0.
