
Gazepath: An Eye-tracking Analysis Tool That Accounts for Individual Differences and Data Quality

Overview
Publisher: Springer
Specialty: Social Sciences
Date: 2017 Jun 9
PMID: 28593606
Citations: 18
Abstract

Eye-trackers are a popular tool for studying cognitive, emotional, and attentional processes in different populations (e.g., clinical and typically developing) and in participants of all ages, ranging from infants to the elderly. This broad range of processes and populations implies that there are many inter- and intra-individual differences that need to be taken into account when analyzing eye-tracking data. Standard parsing algorithms supplied by the eye-tracker manufacturers are typically optimized for adults and do not account for these individual differences. This paper presents gazepath, an easy-to-use R-package that comes with a graphical user interface (GUI) implemented in Shiny (RStudio Inc 2015). The gazepath R-package combines solutions from the adult and infant literature to provide an eye-tracking parsing method that accounts for individual differences and differences in data quality. We illustrate the usefulness of gazepath with three examples on different data sets. The first example shows how gazepath performs on free-viewing data of infants and adults, compared with standard EyeLink parsing; we show that gazepath controls for spurious correlations between fixation durations and data quality in infant data. The second example shows that gazepath performs well on high-quality reading data of adults. The third and final example shows that gazepath can also be used on noisy infant data collected with a Tobii eye-tracker at a low (60 Hz) sampling rate.
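For orientation, a minimal sketch of a typical gazepath call in R follows. Everything specific in it is an assumption for illustration: the file name, column names, and screen geometry are invented, and the argument names follow the package's CRAN documentation rather than this abstract, so check ?gazepath in the installed version before running.

## Minimal sketch (assumptions, not from the paper): the CSV file,
## column names, and screen geometry are invented; argument names
## follow the gazepath CRAN documentation and may differ by version.
install.packages("gazepath")  # may need the CRAN archive if the
library(gazepath)             # package has since been archived

## Hypothetical raw data: one row per gaze sample, with left/right
## eye x-y positions (pixels), eye-to-screen distances (mm), and a
## trial index column.
raw <- read.csv("eyetracking_samples.csv")

## Classify samples into fixations and saccades; gazepath estimates
## participant-specific thresholds from the data itself.
res <- gazepath(raw,
                x1 = "x_left",  y1 = "y_left",        # left-eye gaze (px)
                x2 = "x_right", y2 = "y_right",       # right-eye gaze (px)
                d1 = "dist_left", d2 = "dist_right",  # eye-screen distance (mm)
                trial = "trial",
                height_px = 1024, height_mm = 300,    # screen height (px, mm)
                width_px = 1280,  width_mm = 380,     # screen width (px, mm)
                samplerate = 500)                     # Hz; e.g., 60 for a Tobii set

summary(res)  # fixation and saccade events per trial

## The paper also describes a point-and-click Shiny interface;
## in the CRAN package it is launched with GUI().
GUI()

Users who prefer not to script at all can work entirely from the Shiny GUI, which wraps the same parsing routine.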

Citing Articles

Dissociable genetic influences on eye movements during abstract versus naturalistic social scene viewing in infancy.

Portugal A, Taylor M, Tammimies K, Ronald A, Falck-Ytter T. Sci Rep. 2025; 15(1):4100.

PMID: 39900629 PMC: 11791049. DOI: 10.1038/s41598-024-83557-3.


The fundamentals of eye tracking part 4: Tools for conducting an eye tracking study.

Niehorster D, Nyström M, Hessels R, Andersson R, Benjamins J, Hansen D. Behav Res Methods. 2025; 57(1):46.

PMID: 39762687 PMC: 11703944. DOI: 10.3758/s13428-024-02529-7.


Identification of Language-Induced Mental Load from Eye Behaviors in Virtual Reality.

Schirm J, Gomez-Vargas A, Perusquia-Hernandez M, Skarbez R, Isoyama N, Uchiyama H. Sensors (Basel). 2023; 23(15).

PMID: 37571449 PMC: 10422404. DOI: 10.3390/s23156667.


The Development of Relational Reasoning: An Eyetracking Analysis of Strategy Use and Adaptation in Children and Adults Performing Matrix Completion.

Niebaum J, Munakata Y. Open Mind (Camb). 2023; 7:197-220.

PMID: 37416068 PMC: 10320822. DOI: 10.1162/opmi_a_00078.


A functional model for studying common trends across trial time in eye tracking experiments.

Dong M, Telesca D, Sugar C, Shic F, Naples A, Johnson S. Stat Biosci. 2023; 15(1):261-287.

PMID: 37077750 PMC: 10112660. DOI: 10.1007/s12561-022-09354-6.

