
A Body-centred Frame of Reference Drives Spatial Priming in Visual Search

Overview
Journal Exp Brain Res
Specialty Neurology
Date 2010 Jun 25
PMID 20574687
Citations 5
Abstract

Spatial priming in visual search is a well-documented phenomenon. If the target of a visual search is presented at the same location in subsequent trials, the time taken to find the target at this repeated target location is significantly reduced. Previous studies did not determine which spatial reference frame is used to code the location. At least two reference frames can be distinguished: an observer-related frame of reference (egocentric) or a scene-based frame of reference (allocentric). While past studies suggest that an allocentric reference frame is more effective, we found that an egocentric reference frame is at least as effective as an allocentric one (Ball et al. Neuropsychologia 47(6):1585-1591, 2009). Our previous study did not identify which specific egocentric reference frame was used for the priming: participants could have used a retinotopic or a body-centred frame of reference. Here, we disentangled the retinotopic and body-centred reference frames. In the retinotopic condition, the position of the target stimulus, when repeated, changed with the fixation position, whereas in the body-centred condition, the position of the target stimulus remained the same relative to the display, and thus to the body-midline, but was different relative to the fixation position. We used a conjunction search task to assess the generality of our previous findings. We found that participants relied on body-centred information and not retinotopic cues. Thus, we provide further evidence that egocentric information, and specifically body-centred information, can persist for several seconds, and that these effects are not specific to either a feature or a conjunction search paradigm.

Citing Articles

Egocentric and Allocentric Reference Frames Can Flexibly Support Contextual Cueing.

Zheng L, Dobroschke J, Pollmann S. Front Psychol. 2021; 12:711890.

PMID: 34413816; PMC: 8369006; DOI: 10.3389/fpsyg.2021.711890.


Two Neural Circuits to Point Towards Home Position After Passive Body Displacements.

Blouin J, Saradjian A, Pialasse J, Manson G, Mouchnino L, Simoneau M. Front Neural Circuits. 2019; 13:70.

PMID: 31736717; PMC: 6831616; DOI: 10.3389/fncir.2019.00070.


Positional priming of visual pop-out search is supported by multiple spatial reference frames.

Gokce A, Muller H, Geyer T. Front Psychol. 2015; 6:838.

PMID: 26136718; PMC: 4468829; DOI: 10.3389/fpsyg.2015.00838.


Visual search as a tool for a quick and reliable assessment of cognitive functions in patients with multiple sclerosis.

Utz K, Hankeln T, Jung L, Lammer A, Waschbisch A, Lee D. PLoS One. 2013; 8(11):e81531.

PMID: 24282604; PMC: 3840095; DOI: 10.1371/journal.pone.0081531.


Spatial priming in visual search: memory for body-centred information.

Ball K, Lane A, Ellison A, Schenk T. Exp Brain Res. 2011; 212(3):477-85.

PMID: 21660465; DOI: 10.1007/s00221-011-2754-4.

References
1.
Connolly J, Andersen R, Goodale M. fMRI evidence for a 'parietal reach region' in the human brain. Exp Brain Res. 2003; 153(2):140-5. DOI: 10.1007/s00221-003-1587-1.

2.
Shelton A, McNamara T. Orientation and perspective dependence in route and survey learning. J Exp Psychol Learn Mem Cogn. 2004; 30(1):158-70. DOI: 10.1037/0278-7393.30.1.158.

3.
Committeri G, Galati G, Paradis A, Pizzamiglio L, Berthoz A, Lebihan D. Reference frames for spatial cognition: different brain areas are involved in viewer-, object-, and landmark-centered judgments about object location. J Cogn Neurosci. 2004; 16(9):1517-35. DOI: 10.1162/0898929042568550.

4.
Hillstrom A. Repetition effects in visual search. Percept Psychophys. 2000; 62(4):800-17. DOI: 10.3758/bf03206924.

5.
Tanaka Y, Shimojo S. Repetition priming reveals sustained facilitation and transient inhibition in reaction time. J Exp Psychol Hum Percept Perform. 2000; 26(4):1421-35. DOI: 10.1037//0096-1523.26.4.1421.