
Turning Visual Search Time on Its Head

Overview

Journal: Vision Res
Specialty: Ophthalmology
Date: 2012 May 8
PMID: 22561524
Citations: 18
Abstract

Our everyday visual experience frequently involves searching for objects in clutter. Why are some searches easy and others hard? It is generally believed that the time taken to find a target increases as it becomes similar to its surrounding distractors. Here, I show that while this is qualitatively true, the exact relationship is in fact not linear. In a simple search experiment, when subjects searched for a bar differing in orientation from its distractors, search time was inversely proportional to the angular difference in orientation. Thus, rather than taking search reaction time (RT) to be a measure of target-distractor similarity, we can literally turn search time on its head (i.e., take its reciprocal, 1/RT) to obtain a measure of search dissimilarity that varies linearly over a large range of target-distractor differences. I show that this dissimilarity measure has the properties of a distance metric, and report two interesting insights that come from this measure. First, for a large number of searches, search asymmetries are relatively rare, and when they do occur, they differ by a fixed distance. Second, search distances can be used to elucidate the object representations that underlie search; for example, these representations are roughly invariant to three-dimensional view. Finally, search distance has a straightforward interpretation in the context of accumulator models of search, where it is proportional to the discriminative signal that is integrated to produce a response. This is consistent with recent studies that have linked this distance to neuronal discriminability in visual cortex. Thus, while search time remains the more direct measure of visual search, its reciprocal also has the potential for interesting and novel insights.
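The reciprocal transformation described in the abstract can be sketched in a few lines. The reaction times below are synthetic, generated under the assumed relation RT = c/Δθ (the inverse proportionality the abstract reports); the constant `c` and the angle values are illustrative, not data from the paper.

```python
# Sketch of the reciprocal-RT dissimilarity measure: if search time RT is
# inversely proportional to the target-distractor orientation difference,
# then 1/RT is linear in that difference.

c = 2.0  # hypothetical proportionality constant (s * deg), for illustration only
angles = [5, 10, 20, 40, 60]       # orientation differences (deg)
rts = [c / a for a in angles]      # simulated search times: RT = c / delta_theta

# "Turning search time on its head": dissimilarity d = 1/RT
dissimilarity = [1.0 / rt for rt in rts]

# Under this model, 1/RT grows linearly with the angular difference,
# with a constant slope of 1/c across all conditions.
slopes = [d / a for d, a in zip(dissimilarity, angles)]
print(slopes)
```

With real data, the linearity of 1/RT against the feature difference would be assessed by a regression fit rather than assumed, as in the experiments the abstract describes.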

Citing Articles

Visual homogeneity computations in the brain enable solving property-based visual tasks.

Jacob G, Pramod R, Arun S. eLife. 2025; 13.

PMID: 39964738 PMC: 11835389. DOI: 10.7554/eLife.93033.


What do we see behind an occluder? Amodal completion of statistical properties in complex objects.

Cherian T, Arun S. Atten Percept Psychophys. 2024; 86(8):2721-2739.

PMID: 39461932 DOI: 10.3758/s13414-024-02948-w.


General object-based features account for letter perception.

Janini D, Hamblin C, Deza A, Konkle T. PLoS Comput Biol. 2022; 18(9):e1010522.

PMID: 36155642 PMC: 9536565. DOI: 10.1371/journal.pcbi.1010522.


The time course of categorical and perceptual similarity effects in visual search.

Yeh L, Peelen M. J Exp Psychol Hum Percept Perform. 2022; 48(10):1069-1082.

PMID: 35951407 PMC: 7616436. DOI: 10.1037/xhp0001034.


Feature similarity is non-linearly related to attentional selection: Evidence from visual search and sustained attention tasks.

Chapman A, Stormer V. J Vis. 2022; 22(8):4.

PMID: 35834377 PMC: 9290316. DOI: 10.1167/jov.22.8.4.

