
Spatiotemporal Factors Influence Sound-source Segregation in Localization Behavior

Overview
Journal: J Neurophysiol
Specialties: Neurology, Physiology
Date: 2020 Dec 30
PMID: 33378250
Abstract

To program a goal-directed response in the presence of acoustic reflections, the audio-motor system should suppress the detection of time-delayed sources. We examined the effects of spatial separation and interstimulus delay on the ability of human listeners to localize a pair of broadband sounds in the horizontal plane. Participants indicated how many sounds they heard and where these were perceived by making one or two head-orienting localization responses. Results suggest that perceptual fusion of the two sounds depends on both delay and spatial separation. Leading and lagging stimuli in close spatial proximity required longer stimulus delays to be perceptually separated than those farther apart. Whenever participants heard one sound, their localization responses for synchronous sounds were oriented to a weighted average of both source locations. For short delays, responses were directed toward the leading stimulus location, an effect that increasing spatial separation enhanced. For longer delays, responses were again directed toward a weighted average. When participants perceived two sounds, the first and second responses were directed to either of the leading and lagging source locations, but the perceived temporal order of the two locations was often interchanged (in ∼40% of trials). We show that the percept of two sounds requires sufficient spatiotemporal separation, after which localization can be performed with high accuracy. We propose that the percept of the temporal order of two concurrent sounds results from a different process than localization, and discuss how dynamic lateral excitatory-inhibitory interactions within a spatial sensorimotor map could explain the findings.

Sound localization requires spectral and temporal processing of implicit acoustic cues and is seriously challenged when multiple sources coincide closely in space and time. We systematically varied spatiotemporal disparities between two sounds and instructed listeners to generate goal-directed head movements. We found that even when the auditory system has accurate representations of both sources, it still has trouble deciding whether the scene contained one or two sounds, and in which order they appeared.
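The weighted-average behavior described in the abstract can be illustrated with a minimal sketch. This is a hypothetical toy model, not the authors' analysis: the function name and the fixed weight values are assumptions chosen only to show how a single localization response could fall between two source azimuths, with the weight on the leading source depending on delay.

```python
def weighted_localization(lead_az, lag_az, w_lead):
    """Single-response azimuth (degrees) modeled as a weighted average
    of the leading and lagging source azimuths. w_lead is the weight on
    the leading source; how it varies with delay and spatial separation
    is left unspecified here (hypothetical illustration)."""
    return w_lead * lead_az + (1.0 - w_lead) * lag_az

# Synchronous sounds: equal weights place the response at the spatial average.
synchronous = weighted_localization(-20.0, 20.0, 0.5)   # 0.0 degrees

# Short delay: the leading source dominates, pulling the response toward it.
short_delay = weighted_localization(-20.0, 20.0, 0.9)   # near -16.0 degrees
```

With equal weights the response lands midway between a source at -20° and one at +20°; raising the leading weight shifts it toward the leading source, mimicking the delay-dependent bias reported in the study.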

Citing Articles

Bayesian prior uncertainty and surprisal elicit distinct neural patterns during sound localization in dynamic environments.

Bayram B, Meijer D, Barumerli R, Spierings M, Baumgartner R, Pomper U. Sci Rep. 2025; 15(1):7931.

PMID: 40050310. PMC: 11885517. DOI: 10.1038/s41598-025-90269-9.
