301. Eye-wearable head-mounted tracking and gaze estimation interactive machine system for human–machine interface
- Author
- Chao-Wei Yu, Ko-Feng Lee, Chen-Wei Hung, Chih-Bo Wen, Cheng-Lung Jen, Yen-Lin Chen, and Kai-Yi Chin
- Subjects
Machine vision, Computer science, Computer vision, Eye tracking, Gaze, Headset, Wearable computer, Artificial intelligence, Human–machine interface
- Abstract
In this study, a head-mounted camera was used to track eye behaviors and estimate the gaze point on the user’s visual plane. An elastic mechanism integrated into the headset makes it adaptable to various users, and the wearable cases were prototyped with low-cost cameras to produce an efficient eye-tracking solution. The proposed system extracts and estimates the pupil ellipse from a few camera images of an eye and computes the corresponding three-dimensional eye model. It can then match later images of the same pupil ellipse from the head-mounted camera to recover the possible visual angles. To estimate the gaze point, the system uses multiple-point calibration to solve the related polynomial formula for subsequent angle-to-gaze mapping. The proposed eye-tracking algorithms provide a low-complexity solution with high accuracy, precision, and speed. This low-cost, promising tracking system can be used in headsets for virtual reality, auxiliary equipment, interactive machines, and human–machine interface applications. The algorithm achieves satisfactory performance without a high-end high-speed camera and operates under different lighting sources; at a distance of 50 cm from the screen, the average error of the detection results is stably within 9 pixels, while the average error of the fixation-mapping results is within 3°.
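The multiple-point calibration step described in the abstract can be sketched as a least-squares polynomial fit from estimated eye angles to on-screen gaze points. The sketch below is only illustrative: the function names, the choice of a second-order polynomial, and the nine-point calibration grid are assumptions, not details taken from the paper.

```python
import numpy as np

def poly_features(xy):
    """Second-order polynomial feature vector for one (x, y) sample."""
    x, y = xy
    return np.array([1.0, x, y, x * y, x**2, y**2])

def calibrate(eye_angles, screen_points):
    """Least-squares fit of polynomial coefficients from calibration pairs.

    eye_angles:    (N, 2) array of estimated visual angles per target
    screen_points: (N, 2) array of known on-screen target positions
    Returns a (6, 2) coefficient matrix mapping features -> (gx, gy).
    """
    A = np.array([poly_features(a) for a in eye_angles])  # (N, 6) design matrix
    coeffs, *_ = np.linalg.lstsq(A, screen_points, rcond=None)
    return coeffs

def map_gaze(coeffs, eye_angle):
    """Map one eye-angle sample to a screen gaze point (normalized coords)."""
    return poly_features(eye_angle) @ coeffs

# Usage: a hypothetical nine-point calibration grid, then map one sample back.
targets = np.array([(x, y) for y in (0.1, 0.5, 0.9) for x in (0.1, 0.5, 0.9)])
angles = targets * 0.2 + 0.05  # stand-in for measured visual angles
C = calibrate(angles, targets)
gaze = map_gaze(C, angles[0])  # should land near targets[0]
```

A second-order polynomial is a common compromise in gaze mapping: it can absorb mild nonlinearity between eye rotation and screen position while needing only a handful of calibration targets to solve.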
- Published
- 2021