
Judging Sound Rotation when Listeners and Sounds Rotate: Sound Source Localization is a Multisystem Process

Overview
Journal J Acoust Soc Am
Date 2015 Dec 3
PMID 26627802
Citations 18
Abstract

In four experiments, listeners were rotated or were stationary. Sounds came from a stationary loudspeaker or rotated from loudspeaker to loudspeaker around an azimuth array. When either sounds or listeners rotate, the auditory cues used for sound source localization change, but in the everyday world listeners perceive sound rotation only when sounds rotate, not when listeners rotate. In the everyday world, sound source locations are referenced to positions in the environment (a world-centric reference system). The auditory cues for sound source location, however, indicate locations relative to the head (a head-centric reference system), not locations relative to the world. This paper addresses a general hypothesis that determining the world-centric location of sound sources requires the auditory system to have information both about the auditory cues used for sound source location and about head position. The use of visual and vestibular information in determining rotating head position during sound rotation perception was investigated. The experiments show that sound rotation perception when sources and listeners rotate was based on acoustic, visual, and perhaps vestibular information. The findings are consistent with the general hypothesis and suggest that sound source localization is not based on acoustics alone; it is a multisystem process.

Citing Articles

Molecular analysis of individual differences in talker search at the cocktail-party.

Lutfi R, Pastore T, Rodriguez B, Yost W, Lee J J Acoust Soc Am. 2022; 152(3):1804.

PMID: 36182280 PMC: 9507302. DOI: 10.1121/10.0014116.


Sound source localization is a multisystem process.

Yost W, Pastore M, Dorman M Acoust Sci Technol. 2021; 41(1):113-120.

PMID: 34305431 PMC: 8297655. DOI: 10.1250/ast.41.113.


Frequency-dependent integration of auditory and vestibular cues for self-motion perception.

Shayman C, Peterka R, Gallun F, Oh Y, Chang N, Hullar T J Neurophysiol. 2020; 123(3):936-944.

PMID: 31940239 PMC: 7099484. DOI: 10.1152/jn.00307.2019.


Individual listener differences in azimuthal front-back reversals.

Yost W, Pastore M J Acoust Soc Am. 2019; 146(4):2709.

PMID: 31671982 PMC: 6814437. DOI: 10.1121/1.5129555.


The relative size of auditory scenes of multiple talkers.

Yost W, Pastore M, Pulling K J Acoust Soc Am. 2019; 146(3):EL219.

PMID: 31590525 PMC: 6739207. DOI: 10.1121/1.5125007.