
Perceptual self-position estimation based on gaze tracking in virtual reality

Authors :
Hongmei Liu
Huabiao Qin
Source :
Virtual Reality. 26:269-278
Publication Year :
2021
Publisher :
Springer Science and Business Media LLC, 2021.

Abstract

Depth perception in the human visual system differs between virtual and real space; this depth discrepancy affects the user's spatial judgment in a virtual space, meaning the user cannot precisely locate their self-position there. Existing localization methods ignore the depth discrepancy and concentrate only on increasing location accuracy in real space, so the discrepancy persists in virtual space and induces visual discomfort. In this paper, a localization method based on depth perception is proposed to measure the self-position of the user in a virtual environment. Using binocular gaze tracking, the method estimates perceived depth and constructs an eye matrix by measuring gaze convergence on a target. By comparing the eye matrix with the camera matrix, the method automatically calculates the actual depth of the viewed target; the difference between the actual depth and the perceived depth can then be estimated explicitly, without markers. The position of the virtual camera is compensated by this depth difference to obtain the perceptual self-position. Furthermore, a virtual reality system is redesigned by adjusting the virtual camera position, so that the distance from the user to an object feels the same in virtual and real space. Experimental results demonstrate that the redesigned system improves the user's visual experience, which validates the superiority of the proposed localization method.
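For illustration only, and not the authors' implementation: the sketch below shows one way the camera-compensation idea from the abstract could be realized, assuming the eye tracker supplies eye positions and unit gaze directions and the rendering engine supplies the virtual camera pose and the target position, all in a common world coordinate frame. Perceived depth is triangulated from the vergence of the two gaze rays, actual depth is taken from the camera-to-target geometry, and the camera is offset along its view axis by the difference. All function and variable names are hypothetical, and the sign convention of the compensation is an assumption.

import numpy as np

def closest_point_between_rays(o1, d1, o2, d2):
    # Midpoint of the shortest segment between two gaze rays.
    # o1, o2: eye positions; d1, d2: gaze directions.
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    w0 = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-9:           # near-parallel gaze: no reliable vergence point
        return None
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    return 0.5 * ((o1 + s * d1) + (o2 + t * d2))

def compensate_camera(cam_pos, view_dir, target_pos,
                      left_eye, left_dir, right_eye, right_dir):
    # Shift the virtual camera along its view axis so that the actual depth
    # of the target matches the depth the user perceives via vergence.
    view_dir = view_dir / np.linalg.norm(view_dir)
    vergence = closest_point_between_rays(left_eye, left_dir, right_eye, right_dir)
    if vergence is None:
        return cam_pos                              # keep the pose unchanged
    eye_center = 0.5 * (left_eye + right_eye)
    perceived_depth = (vergence - eye_center) @ view_dir
    actual_depth = (target_pos - cam_pos) @ view_dir
    # Compensate by the depth discrepancy (sign convention is illustrative).
    return cam_pos + (actual_depth - perceived_depth) * view_dir

In a VR loop, such a routine would run once per frame after a gaze sample is available, and the returned position would be written back to the virtual camera before rendering; the paper's eye-matrix/camera-matrix comparison is summarized here simply as the two depth values along the view axis.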

Details

ISSN :
1434-9957 and 1359-4338
Volume :
26
Database :
OpenAIRE
Journal :
Virtual Reality
Accession number :
edsair.doi...........daa318f5f17c381086cac6743251ff03