
What Are the Visuo-motor Tendencies of Omnidirectional Scene Free-viewing in Virtual Reality?

Overview
Journal: J Vis
Specialty: Ophthalmology
Date: 2022 Mar 24
PMID: 35323868
Abstract

Central and peripheral vision during visual tasks have been extensively studied on two-dimensional screens, highlighting their perceptual and functional disparities. This study has two objectives: replicating in virtual reality on-screen gaze-contingent experiments that remove the central or peripheral field of view, and identifying visuo-motor biases specific to the exploration of 360° scenes with a wide field of view. Our results are useful for vision modelling, with applications in gaze position prediction (e.g., content compression and streaming). We ask how previous on-screen findings translate to conditions where observers can use their head to explore stimuli. We implemented a gaze-contingent paradigm to simulate loss of vision in virtual reality, in which participants could freely view omnidirectional natural scenes. This protocol allows the simulation of vision loss over an extended field of view (>80°) and the study of the head's contribution to visual attention. The time-course of visuo-motor variables in our pure free-viewing task reveals long fixations and short saccades during the first seconds of exploration, contrary to the literature on visual tasks guided by instructions. We show that the effect of vision loss is reflected primarily in eye movements, in a manner consistent with the two-dimensional screen literature. We hypothesize that head movements mainly serve to explore the scenes during free-viewing, since the presence of masks did not significantly impact head scanning behaviours. We present new fixational and saccadic visuo-motor tendencies in a 360° context that we hope will help in the creation of gaze prediction models dedicated to virtual reality.
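To make the gaze-contingent paradigm concrete, the sketch below shows one plausible way to blank either the central (scotoma-like) or peripheral (tunnel-vision-like) field of view around the current gaze direction. This is a minimal illustration under assumed conventions, not the authors' implementation; the names (gaze_contingent_mask, pixel_dirs, radius_deg) are hypothetical.

    import numpy as np

    def gaze_contingent_mask(pixel_dirs, gaze_dir, radius_deg=10.0, mode="central"):
        # pixel_dirs: (H, W, 3) array of unit view directions, one per pixel.
        # gaze_dir:   (3,) unit vector from the latest eye-tracker sample.
        # Returns a boolean array marking the pixels to blank out.
        cos_angle = np.clip(pixel_dirs @ gaze_dir, -1.0, 1.0)  # dot product per pixel
        angle_deg = np.degrees(np.arccos(cos_angle))           # angular distance to gaze
        inside = angle_deg <= radius_deg
        # "central" blanks the region around gaze; "peripheral" blanks everything else.
        return inside if mode == "central" else ~inside

    # Example: view directions for a coarse equirectangular 360° frame (2° per pixel).
    H, W = 90, 180
    lat = np.radians(np.linspace(90, -90, H))    # elevation per row
    lon = np.radians(np.linspace(-180, 180, W))  # azimuth per column
    lon_g, lat_g = np.meshgrid(lon, lat)
    pixel_dirs = np.stack([np.cos(lat_g) * np.cos(lon_g),
                           np.cos(lat_g) * np.sin(lon_g),
                           np.sin(lat_g)], axis=-1)
    gaze = np.array([1.0, 0.0, 0.0])             # looking straight ahead
    blank = gaze_contingent_mask(pixel_dirs, gaze, radius_deg=10.0, mode="central")

On each display refresh, the mask would be recomputed from the newest gaze sample so that the simulated vision loss stays locked to the eye; this continual updating is what distinguishes a gaze-contingent mask from a static one.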

Citing Articles

Influence of open-source virtual-reality based gaze training on navigation performance in Retinitis pigmentosa patients in a crossover randomized controlled trial.

Neugebauer A, Sipatchin A, Stingl K, Ivanov I, Wahl S. PLoS One. 2024; 19(2):e0291902.

PMID: 38300913 PMC: 10833541. DOI: 10.1371/journal.pone.0291902.


Flipping the world upside down: Using eye tracking in virtual reality to study visual search in inverted scenes.

Beitner J, Helbing J, Draschkow D, David E, Vo M. J Eye Mov Res. 2023; 15(3).

PMID: 37215533 PMC: 10195094. DOI: 10.16910/jemr.15.3.5.


Eye and head movements while encoding and recognizing panoramic scenes in virtual reality.

Bischof W, Anderson N, Kingstone A. PLoS One. 2023; 18(2):e0282030.

PMID: 36800398 PMC: 9937482. DOI: 10.1371/journal.pone.0282030.


Eye Tracking in Virtual Reality.

Anderson N, Bischof W, Kingstone A. Curr Top Behav Neurosci. 2023; 65:73-100.

PMID: 36710302 DOI: 10.1007/7854_2022_409.


Effects of Transient Loss of Vision on Head and Eye Movements during Visual Search in a Virtual Environment.

David E, Beitner J, Vo M. Brain Sci. 2020; 10(11).

PMID: 33198116 PMC: 7696943. DOI: 10.3390/brainsci10110841.

References
1. Nuthmann A, Malcolm G. Eye guidance during real-world scene search: The role color plays in central and peripheral vision. J Vis. 2016; 16(2):3. DOI: 10.1167/16.2.3.

2. Rayner K, Bertera J. Reading without a fovea. Science. 1979; 206(4417):468-9. DOI: 10.1126/science.504987.

3. Larson A, Loschky L. The contributions of central versus peripheral vision to scene gist recognition. J Vis. 2009; 9(10):6.1-16. DOI: 10.1167/9.10.6.

4. Sitzmann V, Serrano A, Pavel A, Agrawala M, Gutierrez D, Masia B. Saliency in VR: How Do People Explore Virtual Environments? IEEE Trans Vis Comput Graph. 2018; 24(4):1633-1642. DOI: 10.1109/TVCG.2018.2793599.

5. Aguilar C, Castet E. Gaze-contingent simulation of retinopathy: some potential pitfalls and remedies. Vision Res. 2011; 51(9):997-1012. DOI: 10.1016/j.visres.2011.02.010.