A field test of computer-vision-based gaze estimation in psychology.
- Source: Behavior Research Methods; Mar 2024, Vol. 56, Issue 3, p1900-1915, 16p
- Publication Year: 2024
Abstract
- Computer-vision-based gaze estimation refers to techniques that estimate gaze direction directly from video recordings of the eyes or face, without the need for an eye tracker. Although many such methods exist, their validation is often found in the technical literature (e.g., computer science conference papers). We aimed to (1) identify which computer-vision-based gaze estimation methods are usable by the average researcher in fields such as psychology or education, and (2) evaluate these methods. We searched for methods that do not require calibration and have clear documentation. Two toolkits, OpenFace and OpenGaze, were found to fulfill these criteria. First, we present an experiment in which adult participants fixated on nine stimulus points on a computer screen. We filmed their faces with a camera and processed the recorded videos with OpenFace and OpenGaze. We conclude that OpenGaze is accurate and precise enough to be used in screen-based experiments with stimuli separated by at least 11 degrees of gaze angle. OpenFace was not sufficiently accurate for such situations but can potentially be used in sparser environments. We then examined whether OpenFace could be used with horizontally separated stimuli in a sparse environment with infant participants. We compared dwell measures based on OpenFace estimates to the same measures based on manual coding. We conclude that OpenFace gaze estimates may potentially be used with measures such as relative total dwell time to sparse, horizontally separated areas of interest, but should not be used to draw conclusions about measures such as dwell duration.
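- For readers who want to compute a dwell measure like the one described above, here is a minimal sketch in Python. It reads an OpenFace 2.x FeatureExtraction CSV, which reports per-frame horizontal and vertical gaze angles in radians (gaze_angle_x, gaze_angle_y) along with a success flag; the AOI boundaries, thresholds, and the file name participant01.csv are illustrative assumptions, not values taken from the paper.

```python
import csv
from collections import defaultdict

# Illustrative AOIs: horizontal gaze-angle intervals in radians.
# Real boundaries depend on screen geometry, viewing distance,
# and camera placement; these numbers are placeholders.
AOIS = {
    "left":  (-0.60, -0.10),
    "right": ( 0.10,  0.60),
}

def aoi_for(angle_x):
    """Return the name of the AOI containing a horizontal gaze angle, or None."""
    for name, (lo, hi) in AOIS.items():
        if lo <= angle_x <= hi:
            return name
    return None

def relative_total_dwell(csv_path):
    """Fraction of successfully tracked frames whose gaze falls in each AOI.

    Assumes an OpenFace-style CSV with 'success' and 'gaze_angle_x'
    columns; frames where face tracking failed are skipped.
    """
    counts = defaultdict(int)
    total = 0
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            # OpenFace CSV headers often contain leading spaces, so
            # normalize the keys before looking up columns.
            row = {k.strip(): v for k, v in row.items()}
            if int(float(row["success"])) != 1:
                continue
            total += 1
            name = aoi_for(float(row["gaze_angle_x"]))
            if name is not None:
                counts[name] += 1
    return {name: counts[name] / total for name in AOIS} if total else {}

if __name__ == "__main__":
    print(relative_total_dwell("participant01.csv"))
```

- One reason a relative measure like this is safer than dwell duration with noisy frame-by-frame estimates is that occasional misclassified or dropped frames roughly average out in a ratio of frame counts, whereas they break up the contiguous episodes on which duration measures depend, consistent with the authors' caution above.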
Details
- Language: English
- ISSN: 1554-351X
- Volume: 56
- Issue: 3
- Database: Complementary Index
- Journal: Behavior Research Methods
- Publication Type: Academic Journal
- Accession Number: 176452174
- Full Text: https://doi.org/10.3758/s13428-023-02125-1