
Exploration of Factors Affecting Webcam-based Automated Gaze Coding

Overview

Publisher: Springer
Specialty: Social Sciences
Date: 2024 May 1
PMID: 38693440
Abstract

Online experiments have been transforming the field of behavioral research, enabling researchers to increase sample sizes, access diverse populations, lower the costs of data collection, and promote reproducibility. The field of developmental psychology increasingly exploits such online testing approaches. Since infants cannot give explicit behavioral responses, one key outcome measure is infants' gaze behavior. In the absence of automated eyetrackers in participants' homes, automatic gaze classification from webcam data would make it possible to avoid painstaking manual coding. However, the lack of a controlled experimental environment may lead to various noise factors impeding automatic face detection or gaze classification. We created an adult webcam dataset that systematically reproduced noise factors from infant webcam studies which might affect automated gaze coding accuracy. We varied participants' left-right offset, distance to the camera, facial rotation, and the direction of the lighting source. Running two state-of-the-art classification algorithms (iCatcher+ and OWLET) revealed that facial detection performance was particularly affected by the lighting source, while gaze coding accuracy was consistently affected by the distance to the camera and lighting source. Morphing participants' faces to be unidentifiable did not generally affect the results, suggesting facial anonymization could be used when making online video data publicly available, for purposes of further study and transparency. Our findings will guide improving study design for infant and adult participants during online experiments. Moreover, training algorithms using our dataset will allow researchers to improve robustness and allow developmental psychologists to leverage online testing more efficiently.

Citing Articles

Niehorster D, Nyström M, Hessels R, Andersson R, Benjamins J, Hansen D. The fundamentals of eye tracking part 4: Tools for conducting an eye tracking study. Behav Res Methods. 2025; 57(1):46. PMID: 39762687. PMC: 11703944. DOI: 10.3758/s13428-024-02529-7.
