
Speech Comprehension is Correlated with Temporal Response Patterns Recorded from Auditory Cortex

Overview
Specialty: Science
Date: 2001 Nov 8
PMID: 11698688
Citations: 198
Abstract

Speech comprehension depends on the integrity of both the spectral content and temporal envelope of the speech signal. Although the neural processing underlying spectral analysis has been studied intensively, less is known about the processing of temporal information. Most of the speech information conveyed by the temporal envelope is confined to frequencies below 16 Hz, frequencies that roughly match the spontaneous and evoked modulation rates of primary auditory cortex neurons. To test the importance of cortical modulation rates for speech processing, we manipulated the frequency of the temporal envelope of speech sentences and tested the effect on both speech comprehension and cortical activity. Magnetoencephalographic signals were recorded from the auditory cortices of human subjects while they performed a speech comprehension task. The test sentences used in this task were compressed in time. Speech comprehension was degraded when sentence stimuli were presented in more rapid (more compressed) forms. We found that the average comprehension level at each compression correlated with (i) the similarity between the frequencies of the temporal envelopes of the stimulus and the subject's cortical activity ("stimulus-cortex frequency-matching") and (ii) the phase-locking (PL) between the two temporal envelopes ("stimulus-cortex PL"). Of these two correlates, PL was significantly more indicative of single-trial success. Our results suggest that the match between the speech rate and the a priori modulation capacities of the auditory cortex is a prerequisite for comprehension. However, this is not sufficient: stimulus-cortex PL must also be achieved during actual sentence presentation.
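As a concrete illustration of the two measures named in the abstract, the sketch below (Python with NumPy/SciPy) extracts a temporal envelope, estimates its dominant frequency, and computes a phase-locking value between a stimulus envelope and a cortical trace. This is not the authors' analysis pipeline; all signals, names, and parameters (the sampling rate, the 16 Hz low-pass cutoff, the 6 Hz modulation, the noisy synthetic "MEG" trace) are assumptions made only for the example.

# Minimal sketch (not the authors' pipeline): the two stimulus-cortex measures
# computed on synthetic data. All signals and parameters are illustrative.
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

FS = 1000.0  # Hz, assumed common sampling rate for stimulus and MEG trace

def lowpass(x, fs=FS, cutoff=16.0):
    """Keep only the sub-16 Hz band that carries most envelope information."""
    b, a = butter(4, cutoff / (fs / 2.0), btype="low")
    return filtfilt(b, a, x)

def temporal_envelope(x, fs=FS):
    """Temporal envelope: magnitude of the analytic signal, low-pass filtered."""
    return lowpass(np.abs(hilbert(x)), fs)

def dominant_frequency(x, fs=FS):
    """Frequency of the largest spectral peak (DC removed)."""
    spec = np.abs(np.fft.rfft(x - x.mean()))
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    return freqs[np.argmax(spec)]

def phase_locking_value(a, b):
    """PLV between two slow signals: 1 = phases perfectly locked, 0 = unrelated."""
    pa = np.angle(hilbert(a - a.mean()))
    pb = np.angle(hilbert(b - b.mean()))
    return np.abs(np.mean(np.exp(1j * (pa - pb))))

# Toy "sentence": an 80 Hz carrier amplitude-modulated at ~6 Hz (a syllable-like
# rate). Time-compressing a sentence shifts this envelope rate upward.
t = np.arange(0.0, 3.0, 1.0 / FS)
stimulus = (1.0 + 0.5 * np.sin(2 * np.pi * 6.0 * t)) * np.sin(2 * np.pi * 80.0 * t)

# Toy "cortical" trace: a noisy, phase-shifted oscillation tracking the envelope.
cortex = 0.8 * np.sin(2 * np.pi * 6.0 * t - 0.5) + 0.4 * np.random.randn(t.size)

stim_env = temporal_envelope(stimulus)
ctx_slow = lowpass(cortex)

f_stim = dominant_frequency(stim_env)          # stimulus envelope frequency
f_ctx = dominant_frequency(ctx_slow)           # cortical response frequency
plv = phase_locking_value(stim_env, ctx_slow)  # stimulus-cortex phase locking

print(f"frequency matching: stimulus {f_stim:.1f} Hz vs cortex {f_ctx:.1f} Hz")
print(f"stimulus-cortex PLV: {plv:.2f}")

In these terms, stimulus-cortex frequency-matching asks how close f_ctx stays to f_stim as compression pushes f_stim upward, while the PLV asks whether the two envelopes were actually locked in phase during the trial; per the abstract, the latter was the more indicative measure of single-trial success.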

Citing Articles

Reply to: The timing of speech-to-speech synchronization is governed by the P-center.

Assaneo M Commun Biol. 2025; 8(1):231.

PMID: 39948423 PMC: 11825936. DOI: 10.1038/s42003-025-07546-6.


The human auditory cortex concurrently tracks syllabic and phonemic timescales via acoustic spectral flux.

Giroud J, Trebuchon A, Mercier M, Davis M, Morillon B Sci Adv. 2024; 10(51):eado8915.

PMID: 39705351 PMC: 11661434. DOI: 10.1126/sciadv.ado8915.


Dog-human vocal interactions match dogs' sensory-motor tuning.

Deaux E, Piette T, Gaunet F, Legou T, Arnal L, Giraud A PLoS Biol. 2024; 22(10):e3002789.

PMID: 39352912 PMC: 11444399. DOI: 10.1371/journal.pbio.3002789.


Exploring the Interplay Between Language Comprehension and Cortical Tracking: The Bilingual Test Case.

Baus C, Millan I, Chen X, Blanco-Elorrieta E Neurobiol Lang (Camb). 2024; 5(2):484-496.

PMID: 38911463 PMC: 11192516. DOI: 10.1162/nol_a_00141.


Assessing Speech Audibility via Syllabic-Rate Neural Responses in Adults and Children With and Without Hearing Loss.

Pendyala V, Sethares W, Easwar V Trends Hear. 2024; 28:23312165241227815.

PMID: 38545698 PMC: 10976487. DOI: 10.1177/23312165241227815.


References
1.
Aram D, Ekelman B, Nation J . Preschoolers with language disorders: 10 years later. J Speech Hear Res. 1984; 27(2):232-44. DOI: 10.1044/jshr.2702.244. View

2.
Schreiner C, Urbas J . Representation of amplitude modulation in the auditory cortex of the cat. II. Comparison between cortical fields. Hear Res. 1988; 32(1):49-63. DOI: 10.1016/0378-5955(88)90146-3. View

3.
Ahissar E, Arieli A . Figuring space by time. Neuron. 2001; 32(2):185-201. DOI: 10.1016/s0896-6273(01)00466-4. View

4.
Mosher J, Lewis P, Leahy R . Multiple dipole modeling and localization from spatio-temporal MEG data. IEEE Trans Biomed Eng. 1992; 39(6):541-57. DOI: 10.1109/10.141192. View

5.
van den Brink W, Houtgast T . Efficient across-frequency integration in short-signal detection. J Acoust Soc Am. 1990; 87(1):284-91. DOI: 10.1121/1.399295. View