Implicit gaze research for XR systems
- Publication Year :
- 2024
Abstract
- Although eye-tracking technology is being integrated into more VR and MR headsets, its true potential for enhancing user interactions in XR settings remains largely untapped. Presently, the most prevalent gaze application in XR is input control; for example, using gaze to steer a cursor for pointing. However, our eyes evolved primarily for sensory input and for understanding the world around us, yet few XR applications have leveraged natural gaze behavior to infer and support users' intent and cognitive states. Systems that can model a user's context and interaction intent can better support the user by generating contextually relevant content, making the user interface easier to use, highlighting potential errors, and more. This mode of application is not fully exploited in current commercially available XR systems, yet it is likely where paradigm-shifting use cases for eye tracking will emerge. In this paper, we elucidate state-of-the-art applications of eye tracking and propose new research directions to harness its potential fully.
- Comment :
- PhysioCHI: Towards Best Practices for Integrating Physiological Signals in HCI, May 11, 2024, Honolulu, HI, USA
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession number :
- edsarx.2405.13878
- Document Type :
- Working Paper