
Scene Context Influences Without Scene Gist: Eye Movements Guided by Spatial Associations in Visual Search

Overview
Specialty Psychology
Date 2011 May 21
PMID 21598055
Citations 33
Abstract

Although the use of semantic information about the world seems ubiquitous in every task we perform, it is not clear whether we rely on a scene's semantic information to guide attention when searching for something in a specific scene context (e.g., keys in one's living room). To address this question, we compared the contribution of a scene's semantic information (i.e., scene gist) with that of learned spatial associations between objects and context. Using the flash-preview-moving-window paradigm (Castelhano & Henderson, Journal of Experimental Psychology: Human Perception and Performance, 33:753-763, 2007), participants searched for target objects that were placed in either consistent or inconsistent locations and were semantically consistent or inconsistent with the scene gist. The results showed that learned spatial associations were used to guide search even in inconsistent contexts, providing evidence that scene context can affect search performance without consistent scene gist information. We discuss the results in terms of a hierarchical organization of top-down influences of scene context.

Citing Articles

Eye movements reflect active statistical learning.

Arato J, Rothkopf C, Fiser J. J Vis. 2024; 24(5):17.

PMID: 38819805 PMC: 11146064. DOI: 10.1167/jov.24.5.17.


Visual search for reach targets in actionable space is influenced by movement costs imposed by obstacles.

Moskowitz J, Fooken J, Castelhano M, Gallivan J, Flanagan J. J Vis. 2023; 23(6):4.

PMID: 37289172 PMC: 10257340. DOI: 10.1167/jov.23.6.4.


Five Factors that Guide Attention in Visual Search.

Wolfe J, Horowitz T. Nat Hum Behav. 2023; 1(3).

PMID: 36711068 PMC: 9879335. DOI: 10.1038/s41562-017-0058.


Visual search habits and the spatial structure of scenes.

Clarke A, Nowakowska A, Hunt A. Atten Percept Psychophys. 2022; 84(6):1874-1885.

PMID: 35819714 PMC: 9338010. DOI: 10.3758/s13414-022-02506-2.


Peripheral vision in real-world tasks: A systematic review.

Vater C, Wolfe B, Rosenholtz R. Psychon Bull Rev. 2022; 29(5):1531-1557.

PMID: 35581490 PMC: 9568462. DOI: 10.3758/s13423-022-02117-w.


References
1.
Jiang Y, Wagner L. What is learned in spatial contextual cuing--configuration or individual locations? Percept Psychophys. 2004; 66(3):454-63. DOI: 10.3758/bf03194893.

2.
Castelhano M, Henderson J. Initial scene representations facilitate eye movement guidance in visual search. J Exp Psychol Hum Percept Perform. 2007; 33(4):753-63. DOI: 10.1037/0096-1523.33.4.753.

3.
Castelhano M, Pollatsek A, Cave K. Typicality aids search for an unspecified target, but only in identification and not in attentional guidance. Psychon Bull Rev. 2008; 15(4):795-801. DOI: 10.3758/pbr.15.4.795.

4.
Sturz B, Kelly D, Brown M. Facilitation of learning spatial relations among locations by visual cues: generality across spatial configurations. Anim Cogn. 2009; 13(2):341-9. DOI: 10.1007/s10071-009-0283-3.

5.
Epstein R, Ward E. How reliable are visual context effects in the parahippocampal place area? Cereb Cortex. 2009; 20(2):294-303. PMC: 2803731. DOI: 10.1093/cercor/bhp099.