185 results for "Hessels, Roy S."
Search Results
152. Consequences of Eye Color, Positioning, and Head Movement for Eye-Tracking Data Quality in Infant Research
- Author
- Hessels, Roy S., primary, Andersson, Richard, additional, Hooge, Ignace T. C., additional, Nyström, Marcus, additional, and Kemner, Chantal, additional
- Published
- 2015
153. Noise-robust fixation detection in eye movement data: Identification by two-means clustering (I2MC).
- Author
- Hessels, Roy S., Niehorster, Diederick C., Kemner, Chantal, and Hooge, Ignace T. C.
- Abstract
Eye-tracking research in infants and older children has gained a lot of momentum over the last decades. Although eye-tracking research in these participant groups has become easier with the advance of the remote eye-tracker, this often comes at the cost of poorer data quality than in research with well-trained adults (Hessels, Andersson, Hooge, Nyström, & Kemner Infancy, 20, 601–633, 2015; Wass, Forssman, & Leppänen Infancy, 19, 427–460, 2014). Current fixation detection algorithms are not built for data from infants and young children. As a result, some researchers have even turned to hand correction of fixation detections (Saez de Urabain, Johnson, & Smith Behavior Research Methods, 47, 53–72, 2015). Here we introduce a fixation detection algorithm—identification by two-means clustering (I2MC)—built specifically for data across a wide range of noise levels and when periods of data loss may occur. We evaluated the I2MC algorithm against seven state-of-the-art event detection algorithms, and report that the I2MC algorithm's output is the most robust to high noise and data loss levels. The algorithm is automatic, works offline, and is suitable for eye-tracking data recorded with remote or tower-mounted eye-trackers using static stimuli. In addition to application of the I2MC algorithm in eye-tracking research with infants, school children, and certain patient groups, the I2MC algorithm also may be useful when the noise and data loss levels are markedly different between trials, participants, or time points (e.g., longitudinal research). [ABSTRACT FROM AUTHOR]
- Published
- 2017
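A minimal sketch of the two-means clustering idea behind I2MC (record 153). This is not the published MATLAB implementation; the window length, step size, and transition-weight definition below are illustrative simplifications of the approach described in the abstract:

```python
import numpy as np

def two_means_transition_weights(x, y, window=40, step=6):
    """For each sliding window over 2-D gaze samples, split the samples
    into two clusters (k-means, k=2) and mark samples where the cluster
    label switches. Samples that often sit on a cluster transition are
    likely saccade samples; long low-weight stretches are fixation
    candidates. Window/step sizes here are illustrative, not I2MC's."""
    pos = np.column_stack((x, y)).astype(float)
    n = len(pos)
    weights = np.zeros(n)
    counts = np.zeros(n)
    for start in range(0, max(n - window, 1), step):
        seg = pos[start:start + window]
        centers = np.array([seg[0], seg[-1]])  # crude initialization
        for _ in range(10):  # a few Lloyd iterations
            dists = np.linalg.norm(seg[:, None, :] - centers[None], axis=2)
            labels = dists.argmin(axis=1)
            for k in (0, 1):
                if np.any(labels == k):
                    centers[k] = seg[labels == k].mean(axis=0)
        transitions = np.abs(np.diff(labels, prepend=labels[0]))
        weights[start:start + window] += transitions
        counts[start:start + window] += 1
    return weights / np.maximum(counts, 1)

# Usage sketch: fixation candidates are runs where the weight stays low,
# e.g. w = two_means_transition_weights(x, y); fixation = w < 2 * w.mean()
```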
154. Is There a Limit to the Superiority of Individuals with ASD in Visual Search?
- Author
- Hessels, Roy S., primary, Hooge, Ignace T. C., additional, Snijders, Tineke M., additional, and Kemner, Chantal, additional
- Published
- 2013
155. How robust are wearable eye trackers to slow and fast head and body movements?
- Author
- Hooge, Ignace T. C., Niehorster, Diederick C., Hessels, Roy S., Benjamins, Jeroen S., and Nyström, Marcus
- Subjects
- HEAD, EYE tracking
- Abstract
How well can modern wearable eye trackers cope with head and body movement? To investigate this question, we asked four participants to stand still, walk, skip, and jump while fixating a static physical target in space. We did this for six different eye trackers. All the eye trackers were capable of recording gaze during the most dynamic episodes (skipping and jumping). The accuracy became worse as movement got wilder. During skipping and jumping, the biggest error was 5.8°. However, most errors were smaller than 3°. We discuss the implications of decreased accuracy in the context of different research scenarios. [ABSTRACT FROM AUTHOR]
- Published
- 2023
156. Interocular conflict attracts attention
- Author
- Paffen, Chris L. E., primary, Hessels, Roy S., additional, and Van der Stigchel, Stefan, additional
- Published
- 2011
157. Correction to: "Is human classification by experienced untrained observers a gold standard in fixation detection?".
- Author
- Hooge, Ignace T. C., Niehorster, Diederick C., Nyström, Marcus, Andersson, Richard, and Hessels, Roy S.
- Subjects
- CLASSIFICATION, HUMAN beings, STANDARDS
- Abstract
A Correction to this paper has been published: https://doi.org/10.3758/s13428-017-0955-x [ABSTRACT FROM AUTHOR]
- Published
- 2021
158. Large eye–head gaze shifts measured with a wearable eye tracker and an industrial camera.
- Author
- Hooge, Ignace T. C., Niehorster, Diederick C., Nyström, Marcus, and Hessels, Roy S.
- Subjects
- EYE tracking, GAZE, EYE movements, SOCIAL interaction, RESEARCH personnel, CAMERAS
- Abstract
We built a novel setup to record large gaze shifts (up to 140°). The setup consists of a wearable eye tracker and a high-speed camera with fiducial marker technology to track the head. We tested our setup by replicating findings from the classic eye–head gaze shift literature. We conclude that our new inexpensive setup is good enough to investigate the dynamics of large eye–head gaze shifts. This novel setup could be used for future research on large eye–head gaze shifts, but also for research on gaze during, e.g., human interaction. We further discuss reference frames and terminology in head-free eye tracking. Despite a transition from head-fixed eye tracking to head-free gaze tracking, researchers still use head-fixed eye movement terminology when discussing world-fixed gaze phenomena. We propose to use more specific terminology for world-fixed phenomena, including gaze fixation, gaze pursuit, and gaze saccade. [ABSTRACT FROM AUTHOR]
- Published
- 2024
159. How Do Psychology Professors View the Relation Between Scientific Knowledge and Its Applicability and Societal Relevance?
- Author
- Holleman, Gijs A., Hooge, Ignace T. C., Kemner, Chantal, and Hessels, Roy S.
- Subjects
- SCIENTIFIC knowledge, PSYCHOLOGICAL research, JOB applications, RESEARCH personnel, COLLEGE teachers
- Abstract
How do researchers in psychology view the relation between scientific knowledge, its applicability, and its societal relevance? Most research on psychological science and its benefits to society is discussed from a bird's eye view (a meta-scientific perspective), by identifying general trends such as psychology's dominant focus on lab-based experiments and general descriptive theories. In recent years, several critics have argued that this focus has come at the cost of reduced practical and societal relevance. In this study, we interviewed Dutch psychology professors to gauge their views about the relation between psychological research and its relevance to society. We found that psychology professors engaged in a variety of activities to connect science with society, from work in clinical and applied settings, to consultancy, education, and science communication. However, we found that the role of theory when applying scientific knowledge to practical problems is far from straightforward. While most participants regarded theories as relevant to understanding general contexts of application, psychological theories were seldom directly related to specific applications. We compare and discuss our findings in the light of recent discussions about the lack of applicability and societal relevance of psychological science. [ABSTRACT FROM AUTHOR]
- Published
- 2024
160. Minimal reporting guideline for research involving eye tracking (2023 edition).
- Author
- Dunn, Matt J., Alexander, Robert G., Amiebenomo, Onyekachukwu M., Arblaster, Gemma, Atan, Denize, Erichsen, Jonathan T., Ettinger, Ulrich, Giardini, Mario E., Gilchrist, Iain D., Hamilton, Ruth, Hessels, Roy S., Hodgins, Scott, Hooge, Ignace T. C., Jackson, Brooke S., Lee, Helena, Macknik, Stephen L., Martinez-Conde, Susana, Mcilreavy, Lee, Muratori, Lisa M., and Niehorster, Diederick C.
- Abstract
A guideline is proposed that comprises the minimum items to be reported in research studies involving an eye tracker and human or non-human primate participant(s). This guideline was developed over a 3-year period using a consensus-based process via an open invitation to the international eye tracking community. This guideline will be reviewed at maximum intervals of 4 years. [ABSTRACT FROM AUTHOR]
- Published
- 2024
161. The pupil-size artefact (PSA) across time, viewing direction, and different eye trackers.
- Author
- Hooge, Ignace T. C., Niehorster, Diederick C., Hessels, Roy S., Cleveland, Dixon, and Nyström, Marcus
- Subjects
- EYE tracking, CORNEA
- Abstract
The pupil-size artefact (PSA) is the gaze deviation reported by an eye tracker during pupil-size changes if the eye does not rotate. In the present study, we ask three questions: (1) how stable is the PSA over time, (2) does the PSA depend on properties of the eye-tracker setup, and (3) does the PSA depend on the participants' viewing direction? We found that the PSA is very stable over time for periods as long as 1 year, but may differ between participants. When comparing the magnitude of the PSA between eye trackers, we found the magnitude of the obtained PSA to be related to the direction of the eye-tracker-camera axis, suggesting that the angle between the participants' viewing direction and the camera axis affects the PSA. We then investigated the PSA as a function of the participants' viewing direction. The PSA was non-zero for a viewing direction of 0° and depended on the viewing direction. These findings corroborate the suggestion by Choe et al. (Vision Research 118(6755):48–59, 2016), that the PSA can be described by an idiosyncratic and a viewing direction-dependent component. Based on a simulation, we cannot claim that the viewing direction-dependent component of the PSA is caused by the optics of the cornea. [ABSTRACT FROM AUTHOR]
- Published
- 2021
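The two-component description of the PSA attributed to Choe et al. (2016) in record 161 can be summarized in one hedged formula; the linear form and the symbols below are an illustrative assumption, not the paper's notation:

```latex
% Apparent gaze shift e reported by the tracker during a pupil-size
% change \Delta p at viewing direction \theta, decomposed into an
% idiosyncratic gain k_i and a viewing-direction-dependent gain
% k_d(\theta) (assumed linear form, for illustration only):
e(\theta, t) \approx \bigl( k_i + k_d(\theta) \bigr)\, \Delta p(t)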
162. Integrating GlassesViewer and GazeCode: an open-source data analysis alternative for mobile eye-tracking.
- Author
- Benjamins, Jeroen S., Hessels, Roy S., and Niehorster, Diederick C.
- Subjects
- DATA analysis, EYE tracking, VISUAL perception, INTEGRATED software, TEXT files
- Abstract
Mobile eye trackers often come with manufacturer software for data analysis, for instance Tobii Pro Lab for use with Tobii Pro Glasses 2. Tobii Pro Lab, however, is costly and a closed system, thus making it hard to expand or adjust to a researcher's data-analysis needs. Here, we present an alternative that combines the open-source packages GlassesViewer and GazeCode. GlassesViewer automatically parses data files directly from the SD card of the Tobii Pro Glasses and displays azimuth, elevation, gaze velocity, pupil diameter, gyroscope and accelerometer data in a GUI alongside scene- and optionally eye-camera video images. Data can then be inspected for data quality, further annotated (manually or with classifier algorithms) and stored along with the original data. GazeCode takes the annotations from GlassesViewer, presents them in an interface for manual mapping to the visual stimulus and exports to text files for further analysis. To demonstrate the flexibility and effectiveness of the integrated software, a simple experiment was performed. A subject looked for red pins on a message board while wearing the Tobii Pro Glasses 2. Using a custom button box, the subject indicated when a pin was found, the timestamps of which are fed straight into the eye-tracker data. Using GlassesViewer and GazeCode, fixations were classified in the eye-tracking data and manually mapped to the visual stimulus. The output of GazeCode can be used for further (statistical) analysis. Combining GlassesViewer and GazeCode offers an easy-to-use, open-source alternative to manufacturer software for mobile eye-tracking data analysis. [ABSTRACT FROM AUTHOR]
- Published
- 2019
163. Do pupil-based binocular video eye trackers reliably measure vergence?
- Author
- Hooge, Ignace T. C., Hessels, Roy S., and Nyström, Marcus
- Subjects
- EYE, VISUAL perception, ATTENTION control, VIDEOS
- Abstract
Distance to the binocular fixation point, vergence angle, and fixation disparity are useful measures, e.g., to study visual perception, binocular control in reading, and attention in 3D. Are binocular pupil-based video eye trackers accurate enough to produce meaningful binocular measures? Recent research (Wyatt et al., 2010; Wildenmann & Schaeffel, 2013; Drewes et al., 2014) revealed large idiosyncratic systematic errors due to pupil-size changes. We investigated whether the pupil-size artefact in the separate eyes may cause the eye tracker to report apparent vergence changes when the eyeballs do not rotate. To evoke large and small pupils, observers continuously fixated a dot on a screen that slowly changed from black to white in a sinusoidal manner (0.125 Hz). Gaze positions of both eyes were measured with an EyeLink 1000 plus. We obtained vergence changes up to 2° in the eye-tracker signal. Inspection of the corneal reflection signals confirmed that the reported vergence change was not due to actual eye rotation. Due to the pupil-size artefact, pupil-CR or pupil-only video eye trackers are not accurate enough to determine vergence, distance to the binocular fixation point, and fixation disparity. [ABSTRACT FROM AUTHOR]
- Published
- 2019
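To see why the 2° apparent vergence change reported in record 163 matters, consider the underlying geometry: for a target straight ahead, the true vergence angle follows from the interpupillary distance (IPD) and the viewing distance. A small sketch; the 63 mm IPD is an assumed typical value, not taken from the paper:

```python
import math

def vergence_deg(ipd_m: float = 0.063, dist_m: float = 0.77) -> float:
    """Geometric vergence angle (degrees) for a fixation target
    straight ahead at dist_m meters, given interpupillary distance
    ipd_m meters (63 mm is an assumed typical adult value)."""
    return 2.0 * math.degrees(math.atan(ipd_m / (2.0 * dist_m)))

print(f"{vergence_deg():.2f} deg")  # ~4.7 deg at 77 cm viewing distance
# A pupil-driven apparent vergence change of up to 2 deg is thus of
# the same order as the true vergence signal itself at this distance.
```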
164. What is a blink? Classifying and characterizing blinks in eye openness signals.
- Author
- Nyström, Marcus, Andersson, Richard, Niehorster, Diederick C., Hessels, Roy S., and Hooge, Ignace T. C.
- Subjects
- BLINKING (Physiology), DATA quality, EYELIDS, EYE tracking
- Abstract
Blinks, the closing and opening of the eyelids, are used in a wide array of fields where human function and behavior are studied. In data from video-based eye trackers, blink rate and duration are often estimated from the pupil-size signal. However, blinks and their parameters can be estimated only indirectly from this signal, since it does not explicitly contain information about the eyelid position. We ask whether blinks detected from an eye openness signal that estimates the distance between the eyelids (EO blinks) are comparable to blinks detected with a traditional algorithm using the pupil-size signal (PS blinks) and how robust blink detection is when data quality is low. In terms of rate, there was an almost-perfect overlap between EO and PS blinks (F1 score: 0.98) when the head was in the center of the eye tracker's tracking range, where data quality was high, and a high overlap (F1 score: 0.94) when the head was at the edge of the tracking range, where data quality was worse. When there was a difference in blink rate between EO and PS blinks, it was mainly due to data loss in the pupil-size signal. Blink durations were about 60 ms longer in EO blinks compared to PS blinks. Moreover, the dynamics of EO blinks were similar to results from previous literature. We conclude that the eye openness signal together with our proposed blink detection algorithm provides an advantageous method to detect and describe blinks in greater detail. [ABSTRACT FROM AUTHOR]
- Published
- 2024
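A hedged toy criterion for detecting blink candidates in an eye-openness signal, in the spirit of record 164 but not the published algorithm; the threshold fraction and minimum duration are illustrative parameters:

```python
import numpy as np

def detect_blinks(openness, fs, closed_frac=0.3, min_dur=0.03):
    """Detect blink candidates in an eye-openness signal (distance
    between the eyelids), sampled at fs Hz. A candidate is a stretch
    where openness drops below closed_frac of the median baseline for
    at least min_dur seconds. Toy criterion, not the paper's method."""
    openness = np.asarray(openness, dtype=float)
    baseline = np.nanmedian(openness)
    below = openness < closed_frac * baseline
    edges = np.diff(below.astype(int))
    onsets = np.flatnonzero(edges == 1) + 1
    offsets = np.flatnonzero(edges == -1) + 1
    if below[0]:                       # run already active at start
        onsets = np.r_[0, onsets]
    if below[-1]:                      # run still active at end
        offsets = np.r_[offsets, below.size]
    return [(on / fs, off / fs)        # (start_s, end_s) per blink
            for on, off in zip(onsets, offsets)
            if (off - on) / fs >= min_dur]
```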
165. The impact of slippage on the data quality of head-worn eye trackers.
- Author
- Niehorster, Diederick C., Santini, Thiago, Hessels, Roy S., Hooge, Ignace T. C., Kasneci, Enkelejda, and Nyström, Marcus
- Subjects
- DATA quality, GAZE, EYE tracking, FACIAL expression, EYE
- Abstract
Mobile head-worn eye trackers allow researchers to record eye-movement data as participants freely move around and interact with their surroundings. However, participant behavior may cause the eye tracker to slip on the participant's head, potentially strongly affecting data quality. To investigate how this eye-tracker slippage affects data quality, we designed experiments in which participants mimic behaviors that can cause a mobile eye tracker to move. Specifically, we investigated data quality when participants speak, make facial expressions, and move the eye tracker. Four head-worn eye-tracking setups were used: (i) Tobii Pro Glasses 2 in 50 Hz mode, (ii) SMI Eye Tracking Glasses 2.0 60 Hz, (iii) Pupil-Labs' Pupil in 3D mode, and (iv) Pupil-Labs' Pupil with the Grip gaze estimation algorithm as implemented in the EyeRecToo software. Our results show that whereas gaze estimates of the Tobii and Grip remained stable when the eye tracker moved, the other systems exhibited significant errors (0.8–3.1° increase in gaze deviation over baseline) even for the small amounts of glasses movement that occurred during the speech and facial expressions tasks. We conclude that some of the tested eye-tracking setups may not be suitable for investigating gaze behavior when high accuracy is required, such as during face-to-face interaction scenarios. We recommend that users of mobile head-worn eye trackers perform similar tests with their setups to become aware of their characteristics. This will enable researchers to design experiments that are robust to the limitations of their particular eye-tracking setup. [ABSTRACT FROM AUTHOR]
- Published
- 2020
166. Attention Biases for Emotional Facial Expressions During a Free Viewing Task Increase Between 2.5 and 5 Years of Age.
- Author
- Eskola, Eeva, Kataja, Eeva-Leena, Pelto, Juho, Tuulari, Jetro J., Hyönä, Jukka, Häikiö, Tuomo, Hessels, Roy S., Holmberg, Eeva, Nordenswan, Elisabeth, Karlsson, Hasse, Karlsson, Linnea, and Korja, Riikka
- Subjects
- RESEARCH, EYE movements, FACIAL expression, EYE movement measurements, COMPARATIVE studies, DESCRIPTIVE statistics, RESEARCH funding, EMOTIONS, STATISTICAL correlation, ATTENTIONAL bias, LONGITUDINAL method
- Abstract
The normative, developmental changes in affect-biased attention during the preschool years are largely unknown. To investigate the attention bias for emotional versus neutral faces, an eye-tracking measurement and free viewing of paired pictures of facial expressions (i.e., happy, fearful, sad, or angry faces) and nonface pictures with neutral faces were conducted with 367 children participating in a Finnish cohort study at the age of 2.5 years and with 477 children at the age of 5 years, 216 of whom had follow-up measurements. We found an attention-orienting bias for happy and fearful faces versus neutral faces at both age points. An attention-orienting bias for sad faces emerged between 2.5 and 5 years. In addition, there were significant biases in sustained attention toward happy, fearful, sad, and angry faces versus neutral faces, with a bias in sustained attention for fearful faces being the strongest. All biases in sustained attention increased between 2.5 and 5 years of age. Moderate correlations in saccadic latencies were found between 2.5 and 5 years. In conclusion, attention biases for emotional facial expressions seem to be age-specific and specific for the attentional subcomponent. This implies that future studies on affect-biased attention during the preschool years should use small age ranges and cover multiple subcomponents of attention. Public Significance Statement: This study suggests that 2.5- and 5-year-old children show an attentional preference for emotional faces over neutral faces in a free-viewing task. These attention biases for emotional faces increase between 2.5 and 5 years of age. The characterization of the typical development of emotional attention biases is important, as the emotional attention biases seem to have reciprocal connections to socioemotional well-being. [ABSTRACT FROM AUTHOR]
- Published
- 2023
167. Stable eye versus mouth preference in a live speech-processing task.
- Author
- Viktorsson, Charlotte, Valtakari, Niilo V., Falck-Ytter, Terje, Hooge, Ignace T. C., Rudling, Maja, and Hessels, Roy S.
- Subjects
- MOUTH, SPEECH, INDIVIDUAL differences, ENGLISH language
- Abstract
Looking at the mouth region is thought to be a useful strategy for speech-perception tasks. The tendency to look at the eyes versus the mouth of another person during speech processing has thus far mainly been studied using screen-based paradigms. In this study, we estimated the eye-mouth-index (EMI) of 38 adult participants in a live setting. Participants were seated across the table from an experimenter, who read sentences out loud for the participant to remember in both a familiar (English) and unfamiliar (Finnish) language. No statistically significant difference in the EMI between the familiar and the unfamiliar languages was observed. Total relative looking time at the mouth also did not predict the number of correctly identified sentences. Instead, we found that the EMI was higher during an instruction phase than during the speech-processing task. Moreover, we observed high intra-individual correlations in the EMI across the languages and different phases of the experiment. We conclude that there are stable individual differences in looking at the eyes versus the mouth of another person. Furthermore, this behavior appears to be flexible and dependent on the requirements of the situation (speech processing or not). [ABSTRACT FROM AUTHOR]
- Published
- 2023
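The abstract of record 167 does not spell out how the eye-mouth-index (EMI) is computed. One plausible definition, stated here purely as an assumption for illustration, is the proportion of eyes-plus-mouth looking time spent on the eyes:

```python
def eye_mouth_index(t_eyes_s: float, t_mouth_s: float) -> float:
    """Hypothetical EMI (assumed definition, not taken from the paper):
    1.0 means all looking time on the eyes, 0.0 all on the mouth."""
    total = t_eyes_s + t_mouth_s
    return t_eyes_s / total if total > 0 else float("nan")
```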
168. The amplitude of small eye movements can be accurately estimated with video-based eye trackers.
- Author
- Nyström, Marcus, Niehorster, Diederick C., Andersson, Richard, Hessels, Roy S., and Hooge, Ignace T. C.
- Subjects
- EYE movements, CAMCORDERS, SPATIAL resolution, EYE tracking, COMPUTER simulation
- Abstract
Estimating the gaze direction with a digital video-based pupil and corneal reflection (P-CR) eye tracker is challenging, partly because a video camera is limited in terms of spatial and temporal resolution, and because the captured eye images contain noise. Through computer simulation, we evaluated the localization accuracy of pupil and CR centers in the eye image for small eye rotations (≪ 1 deg). Results highlight how inaccuracies in center localization are related to (1) how many pixels the pupil and CR span in the eye camera image, (2) the method to compute the center of the pupil and CRs, and (3) the level of image noise. Our results provide a possible explanation of why the amplitude of small saccades may not be accurately estimated by many currently used video-based eye trackers. We conclude that eye movements with arbitrarily small amplitudes can be accurately estimated using the P-CR eye-tracking principle, given that the level of image noise is low and the pupil and CR span enough pixels in the eye camera, or if localization of the CR is based on the intensity values in the eye image instead of a binary representation. [ABSTRACT FROM AUTHOR]
- Published
- 2023
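The contrast drawn in record 168 between a binary and an intensity-based center estimate is easy to demonstrate. A minimal sketch; the function names and thresholding scheme are mine, not the paper's:

```python
import numpy as np

def binary_centroid(img: np.ndarray, thresh: float):
    """Centroid of the binarized blob: pixels jump in and out of the
    mask as they cross the threshold, so sub-pixel motion of a small
    bright blob (e.g., a corneal reflection) yields step-like estimates."""
    ys, xs = np.nonzero(img > thresh)
    return xs.mean(), ys.mean()

def intensity_centroid(img: np.ndarray):
    """Intensity-weighted centroid: varies smoothly with sub-pixel
    motion because every pixel contributes in proportion to its
    brightness, which is why it resolves smaller eye rotations."""
    ys, xs = np.indices(img.shape)
    total = img.sum()
    return (xs * img).sum() / total, (ys * img).sum() / total
```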
169. Author Correction: Minimal reporting guideline for research involving eye tracking (2023 edition).
- Author
- Dunn, Matt J., Alexander, Robert G., Amiebenomo, Onyekachukwu M., Arblaster, Gemma, Atan, Denize, Erichsen, Jonathan T., Ettinger, Ulrich, Giardini, Mario E., Gilchrist, Iain D., Hamilton, Ruth, Hessels, Roy S., Hodgins, Scott, Hooge, Ignace T. C., Jackson, Brooke S., Lee, Helena, Macknik, Stephen L., Martinez-Conde, Susana, Mcilreavy, Lee, Muratori, Lisa M., and Niehorster, Diederick C.
- Subjects
- EYE tracking
- Abstract
This document is a correction notice for an article titled "Minimal reporting guideline for research involving eye tracking (2023 edition)" published in the journal Behavior Research Methods. The correction includes the addition of a funding note acknowledging support from various organizations. The authors state that the content of the article is their responsibility and does not necessarily reflect the views of the funding organizations. The publisher, Springer Nature, remains neutral regarding jurisdictional claims and institutional affiliations. The document also lists the names of the authors involved in the article. [Extracted from the article]
- Published
- 2024
170. Eye tracking in human interaction: Possibilities and limitations.
- Author
- Valtakari, Niilo V., Hooge, Ignace T. C., Viktorsson, Charlotte, Nyström, Pär, Falck-Ytter, Terje, and Hessels, Roy S.
- Subjects
- EYE tracking, SOCIAL interaction, HUMAN behavior, DECISION trees
- Abstract
There is a long history of interest in looking behavior during human interaction. With the advance of (wearable) video-based eye trackers, it has become possible to measure gaze during many different interactions. We outline the different types of eye-tracking setups that currently exist to investigate gaze during interaction. The setups differ mainly with regard to the nature of the eye-tracking signal (head- or world-centered) and the freedom of movement allowed for the participants. These features place constraints on the research questions that can be answered about human interaction. We end with a decision tree to help researchers judge the appropriateness of specific setups. [ABSTRACT FROM AUTHOR]
- Published
- 2021
171. The Reality of "Real-Life" Neuroscience: A Commentary on Shamay-Tsoory and Mendelsohn (2019).
- Author
- Holleman, Gijs A., Hooge, Ignace T. C., Kemner, Chantal, and Hessels, Roy S.
- Subjects
- NEUROSCIENCES, SOCIAL support, MATHEMATICAL models, PSYCHOLOGY, BEHAVIOR, THEORY, INTERPERSONAL relations
- Abstract
The main thrust of Shamay-Tsoory and Mendelsohn's ecological approach is that "the use of real-life complex, dynamic, naturalistic stimuli provides a solid basis for understanding brain and behavior" (p. 851). Although we support the overall goal and objectives of Shamay-Tsoory and Mendelsohn's approach to "real-life" neuroscience, their review refers to the terms "ecological validity" and "representative design" in a manner different from that originally introduced by Egon Brunswik. Our aim is to clarify Brunswik's original definitions and briefly explain how these concepts pertain to the larger problem of generalizability, not just for history's sake, but because we believe that a proper understanding of these concepts is important for researchers who want to understand human behavior and the brain in the context of everyday experience, and because Brunswik's original ideas may contribute to Shamay-Tsoory and Mendelsohn's ecological approach. Finally, we argue that the popular and often misused concept of "ecological validity" is ill-formed, lacks specificity, and may even undermine the development of theoretically sound and tractable research. [ABSTRACT FROM AUTHOR]
- Published
- 2021
172. Gaze-action coupling, gaze-gesture coupling, and exogenous attraction of gaze in dyadic interactions.
- Author
- Hessels RS, Li P, Balali S, Teunisse MK, Poppe R, Niehorster DC, Nyström M, Benjamins JS, Senju A, Salah AA, and Hooge ITC
- Abstract
In human interactions, gaze may be used to acquire information for goal-directed actions, to acquire information related to the interacting partner's actions, and in the context of multimodal communication. At present, there are no models of gaze behavior in the context of vision that adequately incorporate these three components. In this study, we aimed to uncover and quantify patterns of within-person gaze-action coupling, gaze-gesture and gaze-speech coupling, and coupling between one person's gaze and another person's manual actions, gestures, or speech (or exogenous attraction of gaze) during dyadic collaboration. We showed that in the context of a collaborative Lego Duplo-model copying task, within-person gaze-action coupling is strongest, followed by within-person gaze-gesture coupling, and coupling between gaze and another person's actions. When trying to infer gaze location from one's own manual actions, gestures, or speech or that of the other person, only one's own manual actions were found to lead to better inference compared to a baseline model. The improvement in inferring gaze location was limited, contrary to what might be expected based on previous research. We suggest that inferring gaze location may be most effective for constrained tasks in which different manual actions follow in a quick sequence, while gaze-gesture and gaze-speech coupling may be stronger in unconstrained conversational settings or when the collaboration requires more negotiation. Our findings may serve as an empirical foundation for future theory and model development, and may further be relevant in the context of action/intention prediction for (social) robotics and effective human-robot interaction., (© 2024. The Author(s).)
- Published
- 2024
173. When knowing the activity is not enough to predict gaze.
- Author
- Ghiani A, Amelink D, Brenner E, Hooge ITC, and Hessels RS
- Subjects
- Humans, Male, Adult, Female, Young Adult, Eye Movements/physiology, Visual Perception/physiology, Walking/physiology, Fixation, Ocular/physiology
- Abstract
It is reasonable to assume that where people look in the world is largely determined by what they are doing. The reasoning is that the activity determines where it is useful to look at each moment in time. Assuming that it is vital to accurately judge the positions of the steps when navigating a staircase, it is surprising that people differ a lot in the extent to which they look at the steps. Apparently, some people consider the accuracy of peripheral vision, predictability of the step size, and feeling the edges of the steps with their feet to be good enough. If so, occluding part of the view of the staircase and making it more important to place one's feet gently might make it more beneficial to look directly at the steps before stepping onto them, so that people will more consistently look at many steps. We tested this idea by asking people to walk on staircases, either with or without a tray with two cups of water on it. When carrying the tray, people walked more slowly, but they shifted their gaze across steps in much the same way as they did when walking without the tray. They did not look at more steps. There was a clear positive correlation between the fraction of steps that people looked at when walking with and without the tray. Thus, the variability in the extent to which people look at the steps persists when one makes walking on the staircase more challenging.
- Published
- 2024
174. A field test of computer-vision-based gaze estimation in psychology.
- Author
- Valtakari NV, Hessels RS, Niehorster DC, Viktorsson C, Nyström P, Falck-Ytter T, Kemner C, and Hooge ITC
- Subjects
- Adult, Humans, Eye, Calibration, Video Recording, Eye Movements, Vision, Ocular
- Abstract
Computer-vision-based gaze estimation refers to techniques that estimate gaze direction directly from video recordings of the eyes or face without the need for an eye tracker. Although many such methods exist, their validation is often found in the technical literature (e.g., computer science conference papers). We aimed to (1) identify which computer-vision-based gaze estimation methods are usable by the average researcher in fields such as psychology or education, and (2) evaluate these methods. We searched for methods that do not require calibration and have clear documentation. Two toolkits, OpenFace and OpenGaze, were found to fulfill these criteria. First, we present an experiment where adult participants fixated on nine stimulus points on a computer screen. We filmed their face with a camera and processed the recorded videos with OpenFace and OpenGaze. We conclude that OpenGaze is accurate and precise enough to be used in screen-based experiments with stimuli separated by at least 11 degrees of gaze angle. OpenFace was not sufficiently accurate for such situations but can potentially be used in sparser environments. We then examined whether OpenFace could be used with horizontally separated stimuli in a sparse environment with infant participants. We compared dwell measures based on OpenFace estimates to the same measures based on manual coding. We conclude that OpenFace gaze estimates may potentially be used with measures such as relative total dwell time to sparse, horizontally separated areas of interest, but should not be used to draw conclusions about measures such as dwell duration., (© 2023. The Author(s).)
- Published
- 2024
175. GlassesValidator: A data quality tool for eye tracking glasses.
- Author
- Niehorster DC, Hessels RS, Benjamins JS, Nyström M, and Hooge ITC
- Subjects
- Humans, Software, Data Accuracy, Eye-Tracking Technology
- Abstract
According to the proposal for a minimum reporting guideline for an eye tracking study by Holmqvist et al. (2022), the accuracy (in degrees) of eye tracking data should be reported. Currently, there is no easy way to determine accuracy for wearable eye tracking recordings. To enable determining the accuracy quickly and easily, we have produced a simple validation procedure using a printable poster and accompanying Python software. We tested the poster and procedure with 61 participants using one wearable eye tracker. In addition, the software was tested with six different wearable eye trackers. We found that the validation procedure can be administered within a minute per participant and provides measures of accuracy and precision. Calculating the eye-tracking data quality measures can be done offline on a simple computer and requires no advanced computer skills., (© 2023. The Author(s).)
- Published
- 2024
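The two data-quality measures that the validation procedure in record 175 reports are commonly defined as mean angular offset (accuracy) and RMS of sample-to-sample distances (precision). A sketch using these customary definitions, under a small-angle approximation; it is not the glassesValidator source code:

```python
import numpy as np

def accuracy_deg(gaze: np.ndarray, target: np.ndarray) -> float:
    """Mean angular offset between gaze samples and the validation
    target; both are (n, 2) arrays of (azimuth, elevation) in degrees.
    Small-angle approximation, fine for typical validation geometry."""
    return np.linalg.norm(gaze - target, axis=1).mean()

def precision_rms_s2s_deg(gaze: np.ndarray) -> float:
    """Precision as the RMS of sample-to-sample angular distances."""
    step = np.diff(gaze, axis=0)
    return np.sqrt((np.linalg.norm(step, axis=1) ** 2).mean())
```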
176. Representative design: A realistic alternative to (systematic) integrative design.
- Author
- Holleman GA, Dhami MK, Hooge ITC, and Hessels RS
- Abstract
We disagree with Almaatouq et al. that no realistic alternative exists to the "one-at-a-time" paradigm. Seventy years ago, Egon Brunswik introduced representative design, which offers a clear path to commensurability and generality. Almaatouq et al.'s integrative design cannot guarantee the external validity and generalizability of results, which are sorely needed, while representative design tackles the problem head-on.
- Published
- 2024
177. Retraction Note: Eye tracking: empirical foundations for a minimal reporting guideline.
- Author
- Holmqvist K, Örbom SL, Hooge ITC, Niehorster DC, Alexander RG, Andersson R, Benjamins JS, Blignaut P, Brouwer AM, Chuang LL, Dalrymple KA, Drieghe D, Dunn MJ, Ettinger U, Fiedler S, Foulsham T, van der Geest JN, Hansen DW, Hutton SB, Kasneci E, Kingstone A, Knox PC, Kok EM, Lee H, Lee JY, Leppänen JM, Macknik S, Majaranta P, Martinez-Conde S, Nuthmann A, Nyström M, Orquin JL, Otero-Millan J, Park SY, Popelka S, Proudlock F, Renkewitz F, Roorda A, Schulte-Mecklenbeck M, Sharif B, Shic F, Shovman M, Thomas MG, Venrooij W, Zemblys R, and Hessels RS
- Published
- 2024
178. Eye tracking: empirical foundations for a minimal reporting guideline.
- Author
- Holmqvist K, Örbom SL, Hooge ITC, Niehorster DC, Alexander RG, Andersson R, Benjamins JS, Blignaut P, Brouwer AM, Chuang LL, Dalrymple KA, Drieghe D, Dunn MJ, Ettinger U, Fiedler S, Foulsham T, van der Geest JN, Hansen DW, Hutton SB, Kasneci E, Kingstone A, Knox PC, Kok EM, Lee H, Lee JY, Leppänen JM, Macknik S, Majaranta P, Martinez-Conde S, Nuthmann A, Nyström M, Orquin JL, Otero-Millan J, Park SY, Popelka S, Proudlock F, Renkewitz F, Roorda A, Schulte-Mecklenbeck M, Sharif B, Shic F, Shovman M, Thomas MG, Venrooij W, Zemblys R, and Hessels RS
- Subjects
- Humans, Empirical Research, Eye-Tracking Technology, Eye Movements
- Abstract
In this paper, we present a review of how the various aspects of any study using an eye tracker (such as the instrument, methodology, environment, participant, etc.) affect the quality of the recorded eye-tracking data and the obtained eye-movement and gaze measures. We take this review to represent the empirical foundation for reporting guidelines of any study involving an eye tracker. We compare this empirical foundation to five existing reporting guidelines and to a database of 207 published eye-tracking studies. We find that reporting guidelines vary substantially and do not match with actual reporting practices. We end by deriving a minimal, flexible reporting guideline based on empirical research (Section "An empirically based minimal reporting guideline")., (© 2022. The Author(s).)
- Published
- 2023
179. Eye contact avoidance in crowds: A large wearable eye-tracking study.
- Author
- Hessels RS, Benjamins JS, Niehorster DC, van Doorn AJ, Koenderink JJ, Holleman GA, de Kloe YJR, Valtakari NV, van Hal S, and Hooge ITC
- Subjects
- Humans, Crowding, Walking, Eye, Fixation, Ocular, Eye-Tracking Technology, Wearable Electronic Devices
- Abstract
Eye contact is essential for human interactions. We investigated whether humans are able to avoid eye contact while navigating crowds. At a science festival, we fitted 62 participants with a wearable eye tracker and instructed them to walk a route. Half of the participants were further instructed to avoid eye contact. We report that humans can flexibly allocate their gaze while navigating crowds and avoid eye contact primarily by orienting their head and eyes towards the floor. We discuss implications for crowd navigation and gaze behavior. In addition, we address a number of issues encountered in such field studies with regard to data quality, control of the environment, and participant adherence to instructions. We stress that methodological innovation and scientific progress are strongly interrelated., (© 2022. The Author(s).)
- Published
- 2022
180. Perception of the Potential for Interaction in Social Scenes.
- Author
- Hessels RS, Benjamins JS, van Doorn AJ, Koenderink JJ, and Hooge ITC
- Abstract
In urban environments, humans often encounter other people that may engage one in interaction. How do humans perceive such invitations to interact at a glance? We briefly presented participants with pictures of actors carrying out one of 11 behaviors (e.g., waving or looking at a phone) at four camera-actor distances. Participants were asked to describe what they might do in such a situation, how they decided, and what stood out most in the photograph. In addition, participants rated how likely they deemed interaction to take place. Participants formulated clear responses about how they might act. We show convincingly that what participants would do depended on the depicted behavior, but not the camera-actor distance. The likeliness to interact ratings depended both on the depicted behavior and the camera-actor distance. We conclude that humans perceive the "gist" of photographs and that various aspects of the actor, action, and context depicted in photographs are subjectively available at a glance. Our conclusions are discussed in the context of scene perception, social robotics, and intercultural differences., (© The Author(s) 2021.)
- Published
- 2021
181. How does gaze to faces support face-to-face interaction? A review and perspective.
- Author
- Hessels RS
- Subjects
- Humans, Facial Recognition/physiology, Fixation, Ocular/physiology, Social Interaction, Social Perception
- Abstract
Gaze—where one looks, how long, and when—plays an essential part in human social behavior. While many aspects of social gaze have been reviewed, there is no comprehensive review or theoretical framework that describes how gaze to faces supports face-to-face interaction. In this review, I address the following questions: (1) When does gaze need to be allocated to a particular region of a face in order to provide the relevant information for successful interaction; (2) How do humans look at other people, and faces in particular, regardless of whether gaze needs to be directed at a particular region to acquire the relevant visual information; (3) How does gaze support the regulation of interaction? The work reviewed spans psychophysical research, observational research, and eye-tracking research in both lab-based and interactive contexts. Based on the literature overview, I sketch a framework for future research based on dynamic systems theory. The framework holds that gaze should be investigated in relation to sub-states of the interaction, encompassing sub-states of the interactors, the content of the interaction as well as the interactive context. The relevant sub-states for understanding gaze in interaction vary over different timescales from microgenesis to ontogenesis and phylogenesis. The framework has important implications for vision science, psychopathology, developmental science, and social robotics.
- Published
- 2020
182. Task-related gaze control in human crowd navigation.
- Author
- Hessels RS, van Doorn AJ, Benjamins JS, Holleman GA, and Hooge ITC
- Subjects
- Fixation, Ocular, Humans, Walking, Crowding, Visual Perception
- Abstract
Human crowds provide an interesting case for research on the perception of people. In this study, we investigate how visual information is acquired for (1) navigating human crowds and (2) seeking out social affordances in crowds by studying gaze behavior during human crowd navigation under different task instructions. Observers (n = 11) wore head-mounted eye-tracking glasses and walked two rounds through hallways containing walking crowds (n = 38) and static objects. For round one, observers were instructed to avoid collisions. For round two, observers furthermore had to indicate with a button press whether oncoming people made eye contact. Task performance (walking speed, absence of collisions) was similar across rounds. Fixation durations indicated that heads, bodies, objects, and walls maintained gaze comparably long. Only crowds in the distance maintained gaze relatively longer. We find no compelling evidence that human bodies and heads hold one's gaze more than objects while navigating crowds. When eye contact was assessed, heads were fixated more often and for a total longer duration, which came at the cost of looking at bodies. We conclude that gaze behavior in crowd navigation is task-dependent, and that not every fixation is strictly necessary for navigating crowds. When explicitly tasked with seeking out potential social affordances, gaze is modulated as a result. We discuss our findings in the light of current theories and models of gaze behavior. Furthermore, we show that in a head-mounted eye-tracking study, a large degree of experimental control can be maintained while many degrees of freedom on the side of the observer remain.
- Published
- 2020
183. GlassesViewer: Open-source software for viewing and analyzing data from the Tobii Pro Glasses 2 eye tracker.
- Author
- Niehorster DC, Hessels RS, and Benjamins JS
- Subjects
- Eye, Head, Algorithms, Software
- Abstract
We present GlassesViewer, open-source software for viewing and analyzing eye-tracking data of the Tobii Pro Glasses 2 head-mounted eye tracker as well as the scene and eye videos and other data streams (pupil size, gyroscope, accelerometer, and TTL input) that this headset can record. The software provides the following functionality written in MATLAB: (1) a graphical interface for navigating the study- and recording structure produced by the Tobii Glasses 2; (2) functionality to unpack, parse, and synchronize the various data and video streams comprising a Glasses 2 recording; and (3) a graphical interface for viewing the Glasses 2's gaze direction, pupil size, gyroscope and accelerometer time-series data, along with the recorded scene and eye camera videos. In this latter interface, segments of data can furthermore be labeled through user-provided event classification algorithms or by means of manual annotation. Lastly, the toolbox provides integration with the GazeCode tool by Benjamins et al. (2018), enabling a completely open-source workflow for analyzing Tobii Pro Glasses 2 recordings.
- Published
- 2020
184. Do pupil-based binocular video eye trackers reliably measure vergence?
- Author
- Hooge ITC, Hessels RS, and Nyström M
- Subjects
- Adult, Female, Humans, Light, Male, Middle Aged, Reflex, Pupillary/radiation effects, Video Recording, Vision Disparity/physiology, Young Adult, Convergence, Ocular/physiology, Eye Movements/physiology, Pupil/physiology, Vision, Binocular/physiology
- Abstract
A binocular eye tracker needs to be accurate to enable the determination of vergence, distance to the binocular fixation point, and fixation disparity. These measures are useful in, e.g., the research fields of visual perception, binocular control in reading, and attention in 3D. Are binocular pupil-based video eye trackers accurate enough to produce meaningful binocular measures? Recent research revealed potentially large idiosyncratic systematic errors due to pupil-size changes. With a top-of-the-line eye tracker (SR Research EyeLink 1000 plus), we investigated whether the pupil-size artefact in the separate eyes may cause the eye tracker to report apparent vergence when the eyeballs do not rotate. Participants were asked to fixate a target at a distance of 77 cm for 160 s. We evoked pupil-size changes by varying the light intensity. With increasing pupil size, horizontal vergence reported by the eye tracker decreased in most subjects, up to two degrees. However, this was not due to a rotation of the eyeballs, as identified from the absence of systematic movement in the corneal reflection (CR) signals. From this, we conclude that binocular pupil-CR or pupil-only video eye trackers using the dark pupil technique are not accurate enough to be used to determine vergence, distance to the binocular fixation point, and fixation disparity., (Copyright © 2019 Elsevier Ltd. All rights reserved.)
- Published
- 2019
185. Gaze allocation in face-to-face communication is affected primarily by task structure and social context, not stimulus-driven factors.
- Author
- Hessels RS, Holleman GA, Kingstone A, Hooge ITC, and Kemner C
- Subjects
- Adult, Female, Fixation, Ocular/physiology, Humans, Male, Social Environment, Social Perception, Young Adult, Attention/physiology, Communication, Eye Movements/physiology, Interpersonal Relations, Social Behavior
- Abstract
Gaze allocation to human faces has recently been shown to be greatly dependent on the social context. However, what has not been considered explicitly here is how gaze allocation may be supportive of the specific task that individuals carry out. In the present study, we combined these two insights. We investigated (1) how gaze allocation to facial features in face-to-face communication depends on the task structure and (2) how gaze allocation to facial features depends on the gaze behavior of an interacting partner. To this end, participants and a confederate were asked to converse while their eye movements were monitored using a state-of-the-art dual eye-tracking system. This system is unique in that participants can look each other directly in the eyes. We report that gaze allocation depends on the sub-task being carried out (speaking vs. listening). Moreover, we show that a confederate's gaze shift away from the participants affects their gaze allocation more than a gaze shift towards them. In a second experiment, we show that this gaze-guidance effect is not primarily stimulus-driven. We assert that gaze guidance elicited by the confederate looking away is related to the participants' sub-task of monitoring the confederate for when they can begin speaking. This study exemplifies the importance of both task structure and social context for gaze allocation during face-to-face communication., (Copyright © 2018 Elsevier B.V. All rights reserved.)
- Published
- 2019