Auditory Signals Evolve from Hybrid- to Eye-centered Coordinates in the Primate Superior Colliculus
Overview
Visual and auditory spatial signals initially arise in different reference frames. It has been postulated that auditory signals are translated from a head-centered to an eye-centered frame of reference compatible with the visual spatial maps, but, to date, only various forms of hybrid reference frames for sound have been identified. Here, we show that the auditory representation of space in the superior colliculus involves a hybrid reference frame immediately after sound onset but evolves to become predominantly eye centered, and more similar to the visual representation, by the time of a saccade to that sound. Specifically, during the first 500 ms after sound onset, auditory response patterns (N = 103) were usually neither head nor eye centered: 64% of neurons showed such a hybrid pattern, whereas 29% were more eye centered and 8% were more head centered. This differed from the pattern observed for visual targets (N = 156): 86% were eye centered, <1% were head centered, and only 13% exhibited a hybrid of both reference frames. For auditory-evoked activity observed within 20 ms of the saccade (N = 154), the proportion of eye-centered response patterns increased to 69%, whereas the hybrid and head-centered response patterns dropped to 30% and <1%, respectively. This pattern approached, though it did not quite reach, the pattern observed for saccade-related activity for visual targets: 89% were eye centered, 11% were hybrid, and <1% were head centered (N = 162). The plainly eye-centered visual response patterns and predominantly eye-centered auditory motor response patterns stand in marked contrast to our previous study of the intraparietal cortex, where both visual and auditory sensory and motor-related activity used a predominantly hybrid reference frame (Mullette-Gillman et al. 2005, 2009). Our present findings indicate that auditory signals are ultimately translated into a reference frame roughly similar to that used for vision, but suggest that such signals might emerge only in motor areas responsible for directing gaze to visual and auditory stimuli.
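The classification above rests on comparing a neuron's spatial tuning curves across fixation positions: if the curves align best when targets are expressed relative to the eyes (target position minus eye position), the cell is eye centered; if they align best when expressed relative to the head, it is head centered; intermediate cases are hybrid. The sketch below is a minimal Python illustration of that logic, not the authors' published analysis pipeline; the Gaussian tuning model, the noise level, the weight_eye mixing parameter, and the correlation-based alignment metric are all assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

targets = np.arange(-24, 25, 6.0)          # head-centered target azimuths (deg)
eye_pos = np.array([-12.0, 0.0, 12.0])     # fixation positions (deg)

def simulated_rates(eye, weight_eye=1.0):
    """Toy neuron: Gaussian tuning whose centre mixes eye-centered
    (target - eye) and head-centered (target) coding.
    weight_eye = 1 -> purely eye centered; 0 -> purely head centered."""
    preferred = 6.0                          # preferred location (deg), hypothetical
    loc = weight_eye * (targets - eye) + (1 - weight_eye) * targets
    rate = 40.0 * np.exp(-0.5 * ((loc - preferred) / 12.0) ** 2)
    return rate + rng.normal(0.0, 1.0, rate.shape)   # additive response noise

rates = [simulated_rates(e, weight_eye=1.0) for e in eye_pos]

def alignment(frame):
    """Mean pairwise correlation of tuning curves across fixations, after
    resampling each curve onto a common axis in the given reference frame."""
    common = np.linspace(-12, 12, 25)        # axis covered at every fixation
    curves = []
    for e, r in zip(eye_pos, rates):
        x = targets - e if frame == "eye" else targets
        curves.append(np.interp(common, x, r))
    idx = range(len(curves))
    return np.mean([np.corrcoef(curves[i], curves[j])[0, 1]
                    for i in idx for j in idx if i < j])

r_eye, r_head = alignment("eye"), alignment("head")
print(f"eye-centered alignment  : {r_eye:.2f}")
print(f"head-centered alignment : {r_head:.2f}")
# A cell is called eye centered when r_eye clearly exceeds r_head,
# head centered when the reverse holds, and hybrid when neither frame wins.
```

Running the sketch with weight_eye = 1.0 yields a much higher eye-centered than head-centered alignment; setting it to 0 reverses the ordering, and intermediate values produce the hybrid pattern that dominates the early auditory responses described above.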