
Matching Heard and Seen Speech: An ERP Study of Audiovisual Word Recognition

Overview
Journal: Brain Lang
Publisher: Elsevier
Date: 2016 May 8
PMID: 27155219
Citations: 3
Abstract

Seeing articulatory gestures while listening to speech-in-noise (SIN) significantly improves speech understanding, but the degree of this improvement varies greatly among individuals. We examined the relationship between two distinct stages of visual articulatory processing and SIN perception accuracy by combining a cross-modal repetition priming task with event-related potential (ERP) recordings. Participants first heard a word referring to a common object (e.g., pumpkin) and then decided whether a subsequently presented silent visual articulation matched the word they had just heard. Incongruent articulations elicited a significantly enhanced N400, indicative of mismatch detection at the pre-lexical level. Congruent articulations elicited a significantly larger late positive complex (LPC), indexing articulatory word recognition. Only the N400 difference between incongruent and congruent trials was significantly correlated with individuals' improvement in SIN accuracy when the talker's face was visible.
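As a rough illustration of the final analysis step described above, the sketch below (plain NumPy/SciPy; the array shapes, the time window, the `n400_difference` helper, and all data are hypothetical stand-ins, not taken from the paper) computes a per-participant N400 difference (incongruent minus congruent) over an assumed 300-500 ms window and correlates it with each participant's audiovisual SIN accuracy gain.

```python
import numpy as np
from scipy.stats import pearsonr

def n400_difference(congruent, incongruent, times, window=(0.30, 0.50)):
    """Mean ERP amplitude difference (incongruent - congruent) in an
    assumed N400 window, per participant.

    congruent, incongruent: (n_participants, n_times) averaged ERPs from
    a representative centro-parietal channel (hypothetical layout).
    times: (n_times,) vector of sample times in seconds.
    """
    mask = (times >= window[0]) & (times <= window[1])
    return incongruent[:, mask].mean(axis=1) - congruent[:, mask].mean(axis=1)

# Simulated stand-in data; the study itself used recorded ERPs.
rng = np.random.default_rng(0)
n_subj = 20
times = np.linspace(-0.1, 0.8, 226)
cong = rng.normal(0.0, 1.0, (n_subj, times.size))
# Enhanced N400 = more negative amplitude on incongruent trials.
incong = cong + rng.normal(-1.0, 0.5, (n_subj, 1))

diff = n400_difference(cong, incong, times)
# Hypothetical behavioral measure: AV-minus-A SIN accuracy gain (%).
sin_gain = -0.5 * diff + rng.normal(0.0, 0.3, n_subj)

r, p = pearsonr(diff, sin_gain)
print(f"N400 difference vs. SIN gain: r = {r:.2f}, p = {p:.3f}")
```

This only mirrors the shape of the reported analysis (a difference measure entered into an individual-differences correlation); the study's actual electrode sites, time windows, and statistics may differ.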

Citing Articles

Electrophysiological Dynamics of Visual Speech Processing and the Role of Orofacial Effectors for Cross-Modal Predictions.

Michon M, Boncompte G, Lopez V. Front Hum Neurosci. 2020; 14:538619.

PMID: 33192386; PMC: 7653187; DOI: 10.3389/fnhum.2020.538619.


The Cross-Modal Suppressive Role of Visual Context on Speech Intelligibility: An ERP Study.

Shen S, Kerlin J, Bortfeld H, Shahin A. Brain Sci. 2020; 10(11).

PMID: 33147691; PMC: 7692090; DOI: 10.3390/brainsci10110810.


Atypical audiovisual word processing in school-age children with a history of specific language impairment: an event-related potential study.

Kaganovich N, Schumaker J, Rowland C. J Neurodev Disord. 2016; 8(1):33.

PMID: 27597881; PMC: 5011345; DOI: 10.1186/s11689-016-9168-3.
