A Procedure for Measuring Auditory and Audio-visual Speech-reception Thresholds for Sentences in Noise: Rationale, Evaluation, and Recommendations for Use
The strategy for measuring speech-reception thresholds for sentences in noise advocated by Plomp and Mimpen (Audiology, 18, 43-52, 1979) was modified to create a reliable test for measuring the difficulty that listeners have in speech reception, both auditorily and audio-visually. The test materials consist of 10 lists of 15 short sentences of homogeneous intelligibility when presented acoustically, and of different, but still homogeneous, intelligibility when presented audio-visually, in white noise. Homogeneity was achieved by applying phonetic and linguistic principles at the stage of compilation, followed by pilot testing and balancing of properties. To run the test, lists are presented at signal-to-noise ratios (SNRs) determined by an up-down psychophysical rule so as to estimate auditory and audio-visual speech-reception thresholds, defined as the SNRs at which the three content words in each sentence are identified correctly on 50% of trials. These thresholds provide measures of a subject's speech-reception abilities. The difference between them provides a measure of the benefit received from vision. It is shown that this measure is closely related to the accuracy with which subjects lip-read words in sentences with no acoustical information. In data from normally hearing adults, the standard deviations (s.d.s) of estimates of auditory speech-reception threshold in noise (SRTN), audio-visual SRTN, and visual benefit are 1.2, 2.0, and 2.3 dB, respectively. Graphs are provided with which to estimate the trade-off between reliability and the number of lists presented, and to assess the significance of deviant scores from individual subjects.
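The adaptive procedure described above can be sketched as a simple 1-up/1-down staircase on SNR: after a correct response (all three content words identified) the SNR is lowered, and after an incorrect response it is raised, so presentations converge on the 50%-correct point. The step size (2 dB), starting SNR, number of presentations skipped before averaging, and the simulated listener below are illustrative assumptions, not the authors' exact published procedure.

```python
def run_track(respond, start_snr=0.0, step_db=2.0, n_sentences=15):
    """Estimate an SRT with a 1-up/1-down adaptive track.

    respond(snr) should return True when the listener reports all
    three content words of the sentence correctly at that SNR (dB).
    """
    snr = start_snr
    snrs = []
    for _ in range(n_sentences):
        snrs.append(snr)
        if respond(snr):
            snr -= step_db   # correct: make the next sentence harder
        else:
            snr += step_db   # incorrect: make the next sentence easier
    # Rough SRT estimate: mean SNR of the later presentations, after
    # the track has had a few trials to approach threshold.
    tail = snrs[4:]
    return sum(tail) / len(tail)


# Deterministic simulated listener for illustration: responds correctly
# whenever the SNR is at least -6 dB.
auditory_srt = run_track(lambda snr: snr >= -6.0)

# The same track run on audio-visual presentation would typically reach a
# lower (better) SRT; the visual benefit is the difference between the two:
audio_visual_srt = run_track(lambda snr: snr >= -10.0)
visual_benefit = auditory_srt - audio_visual_srt
```

With this deterministic listener the track oscillates around the simulated threshold, so the averaged tail lands between the last two step levels; real listeners' probabilistic responses make the 50% point the convergence target.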