OpenEDS2020 Challenge on Gaze Tracking for VR: Dataset and Results.
- Source :
- Sensors (14248220). Jul 2021, Vol. 21, Issue 14, p4769. 1p.
- Publication Year :
- 2021
Abstract
- This paper summarizes the OpenEDS 2020 Challenge dataset, the proposed baselines, and results obtained by the top three winners of each competition: (1) Gaze prediction Challenge, with the goal of predicting the gaze vector 1 to 5 frames into the future based on a sequence of previous eye images, and (2) Sparse Temporal Semantic Segmentation Challenge, with the goal of using temporal information to propagate semantic eye labels to contiguous eye image frames. Both competitions were based on the OpenEDS2020 dataset, a novel dataset of eye-image sequences captured at a frame rate of 100 Hz under controlled illumination, using a virtual-reality head-mounted display with two synchronized eye-facing cameras. The dataset, which we make publicly available for the research community, consists of 87 subjects performing several gaze-elicited tasks, and is divided into 2 subsets, one for each competition task. The proposed baselines, based on deep learning approaches, obtained an average angular error of 5.37 degrees for gaze prediction, and a mean intersection over union score (mIoU) of 84.1% for semantic segmentation. The winning solutions were able to outperform the baselines, obtaining up to 3.17 degrees for the former task and 95.2% mIoU for the latter. [ABSTRACT FROM AUTHOR]
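The abstract reports two evaluation metrics: mean angular error in degrees for gaze prediction, and mean intersection over union (mIoU) for semantic segmentation. As an illustrative sketch only (the function names and exact evaluation protocol are assumptions, not taken from the challenge), these metrics can be computed as follows:

```python
import numpy as np

def angular_error_deg(pred, gt):
    """Mean angle in degrees between predicted and ground-truth 3D gaze vectors.

    pred, gt: arrays of shape (N, 3); rows need not be unit length.
    """
    pred = pred / np.linalg.norm(pred, axis=1, keepdims=True)
    gt = gt / np.linalg.norm(gt, axis=1, keepdims=True)
    # Clip to guard against floating-point values slightly outside [-1, 1].
    cos_sim = np.clip(np.sum(pred * gt, axis=1), -1.0, 1.0)
    return float(np.degrees(np.arccos(cos_sim)).mean())

def mean_iou(pred, gt, num_classes):
    """Mean IoU over classes for integer-labeled segmentation maps.

    Classes absent from both prediction and ground truth are skipped.
    """
    ious = []
    for c in range(num_classes):
        intersection = np.logical_and(pred == c, gt == c).sum()
        union = np.logical_or(pred == c, gt == c).sum()
        if union > 0:
            ious.append(intersection / union)
    return float(np.mean(ious))
```

For example, a prediction identical to the ground truth yields 0 degrees of angular error and an mIoU of 1.0; the challenge winners reached up to 3.17 degrees and 95.2% mIoU on the actual test sets.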
- Subjects :
- *EYE tracking
*DEEP learning
*HEAD-mounted displays
*SCIENTIFIC community
*GAZE
Details
- Language :
- English
- ISSN :
- 14248220
- Volume :
- 21
- Issue :
- 14
- Database :
- Academic Search Index
- Journal :
- Sensors (14248220)
- Publication Type :
- Academic Journal
- Accession number :
- 151610917
- Full Text :
- https://doi.org/10.3390/s21144769