
Gaze-in-wild: A Dataset for Studying Eye and Head Coordination in Everyday Activities

Overview
Journal: Sci Rep
Specialty: Science
Date: 2020 Feb 15
PMID: 32054884
Citations: 29
Abstract

The study of gaze behavior has primarily been constrained to controlled environments in which the head is fixed. Consequently, little effort has been invested in the development of algorithms for the categorization of gaze events (e.g. fixations, pursuits, saccades, gaze shifts) while the head is free, and thus contributes to the velocity signals upon which classification algorithms typically operate. Our approach was to collect a novel, naturalistic, and multimodal dataset of eye + head movements as subjects performed everyday tasks while wearing a mobile eye tracker equipped with an inertial measurement unit and a 3D stereo camera. This Gaze-in-the-Wild dataset (GW) includes eye + head rotational velocities (deg/s), infrared eye images and scene imagery (RGB + D). A portion was labelled by coders into gaze motion events with an inter-coder agreement of 0.74 (sample-based Cohen's κ). This labelled data was used to train and evaluate two machine learning classifiers, a Random Forest and a Recurrent Neural Network model, for gaze event classification. Assessment involved the application of established and novel event-based performance metrics. Classifiers achieve ~87% of human performance in detecting fixations and saccades but fall short (50%) on detecting pursuit movements. Moreover, pursuit classification is far worse in the absence of head movement information. A subsequent analysis of feature significance in our best-performing model revealed that classification can be done using only the magnitudes of eye and head movements, potentially removing the need for calibration between the head and eye tracking systems. The GW dataset, trained classifiers and evaluation metrics will be made publicly available with the intention of facilitating growth in the emerging area of head-free gaze event classification.
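The pipeline summarized above (per-sample eye + head velocities, a Random Forest classifier, and sample-based Cohen's κ as the agreement statistic) can be illustrated with a minimal sketch. This is not the authors' released code: the file name, column names, and label coding below are assumptions for illustration, and the feature set simply follows the abstract's observation that eye and head velocity magnitudes alone can suffice.

# Minimal sketch of head-free gaze event classification as described in the
# abstract: a Random Forest trained on eye and head velocity magnitudes,
# scored against human labels with sample-based Cohen's kappa.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import cohen_kappa_score
from sklearn.model_selection import train_test_split

# Hypothetical labelled export: per-sample eye/head rotational speeds (deg/s)
# and a coder-assigned event label (0=fixation, 1=pursuit, 2=saccade, 3=gaze shift).
df = pd.read_csv("gw_labelled_samples.csv")

# Velocity magnitudes only, mirroring the feature-significance finding that
# they can suffice without calibration between the head and eye trackers.
X = np.abs(df[["eye_velocity_dps", "head_velocity_dps"]].to_numpy())
y = df["event_label"].to_numpy()

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0
)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

# Sample-based Cohen's kappa between predictions and human labels, the same
# statistic used to report the 0.74 inter-coder agreement above.
print("Cohen's kappa vs. human labels:", cohen_kappa_score(y_test, clf.predict(X_test)))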

Citing Articles

Cognitive load affects gaze dynamics during real-world tasks.

Martinez-Cedillo A, Gavrila N, Mishra A, Geangu E, Foulsham T. Exp Brain Res. 2025; 243(4):82.

PMID: 40029455 PMC: 11876210. DOI: 10.1007/s00221-025-07037-4.


The fundamentals of eye tracking part 4: Tools for conducting an eye tracking study.

Niehorster D, Nystrom M, Hessels R, Andersson R, Benjamins J, Hansen D. Behav Res Methods. 2025; 57(1):46.

PMID: 39762687 PMC: 11703944. DOI: 10.3758/s13428-024-02529-7.


A dataset of paired head and eye movements during visual tasks in virtual environments.

Rubow C, Tsai C, Brewer E, Mattson C, Brown D, Zhang H. Sci Data. 2024; 11(1):1328.

PMID: 39639071 PMC: 11621368. DOI: 10.1038/s41597-024-04184-1.


A Child-Friendly Wearable Device for Quantifying Environmental Risk Factors for Myopia.

Gibaldi A, Harb E, Wildsoet C, Banks M. Transl Vis Sci Technol. 2024; 13(10):28.

PMID: 39422897 PMC: 11498637. DOI: 10.1167/tvst.13.10.28.


The visual experience dataset: Over 200 recorded hours of integrated eye movement, odometry, and egocentric video.

Greene M, Balas B, Lescroart M, MacNeilage P, Hart J, Binaee K. J Vis. 2024; 24(11):6.

PMID: 39377740 PMC: 11466363. DOI: 10.1167/jov.24.11.6.

