
Crossmodal Identification

Overview
Journal Trends Cogn Sci
Date 2011 Jan 20
PMID 21244923
Citations 30
Abstract

Everyday experience involves the continuous integration of information from multiple sensory inputs. Such crossmodal interactions are advantageous since the combined action of different sensory cues can provide information unavailable from their individual operation, reducing perceptual ambiguity and enhancing responsiveness. The behavioural consequences of such multimodal processes and their putative neural mechanisms have been investigated extensively with respect to orienting behaviour and, to a lesser extent, the crossmodal coordination of spatial attention. These operations are concerned mainly with the determination of stimulus location. However, information from different sensory streams can also be combined to assist stimulus identification. Psychophysical and physiological data indicate that these two crossmodal processes are subject to different temporal and spatial constraints at both the behavioural and neuronal levels and involve the participation of distinct neural substrates. Here we review the evidence for such a dissociation and discuss recent neurophysiological, neuroanatomical and neuroimaging findings that shed light on the mechanisms underlying crossmodal identification, with specific reference to audio-visual speech perception.

Citing Articles

The Processing of Audiovisual Speech Is Linked with Vocabulary in Autistic and Nonautistic Children: An ERP Study.

Dunham-Carr K, Feldman J, Simon D, Edmunds S, Tu A, Kuang W, et al. Brain Sci. 2023; 13(7).

PMID: 37508976 PMC: 10377472. DOI: 10.3390/brainsci13071043.


Auditory perception dominates in motor rhythm reproduction.

Hildebrandt A, Griessbach E, Canal-Bruland R. Perception. 2022; 51(6):403-416.

PMID: 35440242 PMC: 9121532. DOI: 10.1177/03010066221093604.


Excitatory Crossmodal Input to a Widespread Population of Primary Sensory Cortical Neurons.

Xiao Y, Wang L, Liu Y, Chen J, Zhang H, Gao Y, et al. Neurosci Bull. 2022; 38(10):1139-1152.

PMID: 35429324 PMC: 9554107. DOI: 10.1007/s12264-022-00855-4.


Mechanisms by which Early Eye Gaze to the Mouth During Multisensory Speech Influences Expressive Communication Development in Infant Siblings of Children with and without Autism.

Santapuram P, Feldman J, Bowman S, Raj S, Suzman E, Crowley S, et al. Mind Brain Educ. 2022; 16(1):62-74.

PMID: 35273650 PMC: 8903197. DOI: 10.1111/mbe.12310.


Substituting facial movements in singers changes the sounds of musical intervals.

Laeng B, Kuyateh S, Kelkar T. Sci Rep. 2021; 11(1):22442.

PMID: 34789775 PMC: 8599708. DOI: 10.1038/s41598-021-01797-z.