
Generic HRTFs May Be Good Enough in Virtual Reality. Improving Source Localization Through Cross-Modal Plasticity

Overview
Journal Front Neurosci
Date 2018 Feb 20
PMID 29456486
Citations 13
Abstract

Auditory spatial localization in humans is performed using a combination of interaural time differences, interaural level differences, and spectral cues provided by the geometry of the ear. To render spatialized sounds within a virtual reality (VR) headset, either individualized or generic Head Related Transfer Functions (HRTFs) are usually employed. The former require arduous calibrations, but enable accurate auditory source localization, which may lead to a heightened sense of presence within VR. The latter obviate the need for individualized calibrations, but result in less accurate auditory source localization. Previous research on auditory source localization in the real world suggests that our representation of acoustic space is highly plastic. In light of these findings, we investigated whether auditory source localization could be improved for users of generic HRTFs via cross-modal learning. The results show that pairing a dynamic auditory stimulus with a spatio-temporally aligned visual counterpart enabled users of generic HRTFs to improve subsequent auditory source localization. Exposure to the auditory stimulus alone or to asynchronous audiovisual stimuli did not improve auditory source localization. These findings have important implications for human perception as well as for the development of VR systems, as they indicate that generic HRTFs may be enough to enable good auditory source localization in VR.
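For context, the rendering step the abstract alludes to is, in the time domain, a pair of convolutions: the mono source signal is filtered with the left- and right-ear head-related impulse responses (HRIRs, the time-domain form of the HRTF) for the desired direction, which imparts the interaural time and level differences and the spectral cues described above. Below is a minimal sketch in Python, assuming NumPy/SciPy and purely hypothetical placeholder HRIRs; a real renderer would look up or interpolate measured responses from an individualized or generic HRTF set.

    import numpy as np
    from scipy.signal import fftconvolve

    def woodworth_itd(azimuth_rad, head_radius=0.0875, c=343.0):
        # Woodworth's classic spherical-head approximation of the
        # interaural time difference (in seconds) at a given azimuth.
        return (head_radius / c) * (azimuth_rad + np.sin(azimuth_rad))

    def render_binaural(mono, hrir_left, hrir_right):
        # Spatialize a mono signal by convolving it with the left- and
        # right-ear HRIRs for one direction; returns an (N, 2) stereo array.
        left = fftconvolve(mono, hrir_left)
        right = fftconvolve(mono, hrir_right)
        return np.stack([left, right], axis=-1)

    # Usage with placeholder HRIRs (windowed noise bursts stand in for
    # measured impulse responses, purely for illustration).
    fs = 48000
    t = np.arange(fs) / fs
    mono = np.sin(2 * np.pi * 440 * t)  # 1 s, 440 Hz test tone
    rng = np.random.default_rng(0)
    hrir_l = rng.standard_normal(256) * np.hanning(256)
    hrir_r = rng.standard_normal(256) * np.hanning(256)
    stereo = render_binaural(mono, hrir_l, hrir_r)

The individualized-versus-generic trade-off the study probes lives entirely in where hrir_left and hrir_right come from: responses measured on the listener's own ears, or a single generic (e.g. dummy-head) set shared by all users.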

Citing Articles

Multisensory stimuli facilitate low-level perceptual learning on a difficult global motion task in virtual reality.

Fromm C, Maddox R, Polonenko M, Huxlin K, Diaz G. PLoS One. 2025; 20(3):e0319007.

PMID: 40036211; PMC: 11878941; DOI: 10.1371/journal.pone.0319007.


Happy new ears: Rapid adaptation to novel spectral cues in vertical sound localization.

Parise C, Gori M, Finocchietti S, Ernst M, Esposito D, Tonelli A. iScience. 2024; 27(12):111308.

PMID: 39640573; PMC: 11617380; DOI: 10.1016/j.isci.2024.111308.


Auditory localization: a comprehensive practical review.

Carlini A, Bordeau C, Ambard M. Front Psychol. 2024; 15:1408073.

PMID: 39049946; PMC: 11267622; DOI: 10.3389/fpsyg.2024.1408073.


Sound localization in web-based 3D environments.

Rajguru C, Brianza G, Memoli G. Sci Rep. 2022; 12(1):12107.

PMID: 35840617; PMC: 9287443; DOI: 10.1038/s41598-022-15931-y.


Virtual reality training improves dynamic balance in children with cerebral palsy.

Pourazar M, Bagherzadeh F, Mirakhori F. Int J Dev Disabil. 2021; 67(6):429-434.

PMID: 34925773; PMC: 8676581; DOI: 10.1080/20473869.2019.1679471.

