SalientGaze: Saliency-based gaze correction in virtual reality
- Authors
- Elmar Eisemann, Peiteng Shi, and Markus Billeter
- Subjects
- Computer science, Image processing and computer vision, General Engineering, Computer Graphics and Computer-Aided Design, Human-Computer Interaction, Virtual reality, Headset, Gaze, Eye tracking, Rendering (computer graphics), Computer vision, Artificial intelligence
- Abstract
Eye-tracking with gaze estimation is a key element in many applications, ranging from foveated rendering and user interaction to behavioural analysis and usage metrics. For virtual reality, eye-tracking typically relies on near-eye cameras that are mounted in the VR headset. Such methods usually involve an initial calibration to create a mapping from eye features to a gaze position. However, the accuracy based on the initial calibration degrades when the position of the headset relative to the user's head changes; this is especially noticeable when users readjust the headset for comfort or even completely remove it for a short while. We show that a correction of such shifts can be achieved via 2D drift vectors in eye space. Our method estimates these drifts by extracting salient cues from the shown virtual environment to determine potential gaze directions. Our solution can compensate for HMD shifts, even those arising from taking off the headset, which enables us to eliminate reinitialization steps.
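The abstract does not spell out how the drift vector is computed; the sketch below is only an illustration of the general idea, not the authors' implementation. It assumes the stale calibration still produces gaze predictions, pairs each prediction with a nearby saliency peak the user is presumed to fixate, and takes a robust (median) average of the offsets as a 2D drift estimate that is then added back to raw gaze samples. All names (`estimate_drift`, `correct_gaze`, `max_dist`) and the outlier threshold are hypothetical.

```python
import numpy as np

def estimate_drift(gaze_points, saliency_peaks, max_dist=0.1):
    """Estimate a 2D drift vector in normalized eye/screen space.

    gaze_points:    (N, 2) gaze positions predicted by the stale calibration
    saliency_peaks: (N, 2) salient points in the rendered frame that the user
                    is assumed to be fixating at the corresponding times
    max_dist:       discard pairs whose offset is implausibly large (outliers)
    """
    offsets = saliency_peaks - gaze_points              # per-sample correction
    valid = np.linalg.norm(offsets, axis=1) < max_dist  # reject gross mismatches
    if not np.any(valid):
        return np.zeros(2)                              # no reliable evidence yet
    return np.median(offsets[valid], axis=0)            # robust drift estimate

def correct_gaze(raw_gaze, drift):
    """Apply the estimated drift correction to raw gaze samples."""
    return raw_gaze + drift

# Toy usage: the tracker drifts by (0.05, -0.02) after the headset is readjusted.
rng = np.random.default_rng(0)
true_fixations = rng.uniform(0.2, 0.8, size=(50, 2))        # salient targets
raw_gaze = true_fixations - np.array([0.05, -0.02]) \
           + rng.normal(0.0, 0.005, size=(50, 2))            # shifted, noisy gaze
drift = estimate_drift(raw_gaze, true_fixations)
print(drift)                                                 # close to [0.05, -0.02]
corrected = correct_gaze(raw_gaze, drift)
```

In practice a running estimate of this kind could be updated continuously, so the correction adapts whenever the headset is nudged or taken off and put back on; how the paper selects and weights the salient cues is not described in the abstract.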
- Published
- 2020