Appearance-based gaze estimation under slight head motion

Research paper by Zhizhi Guo, Qianxiang Zhou, Zhongqi Liu

Indexed on: 09 Jan '16
Published on: 09 Jan '16
Published in: Multimedia Tools and Applications



Abstract

Many existing gaze estimation methods achieve accurate results under ideal conditions, but practical issues such as head motion and eye blinking remain the biggest challenges to accuracy. Improving estimation accuracy and tolerance to head motion are therefore common goals in the field of gaze estimation. This paper proposes an accurate gaze estimation method that does not require a fixed head pose. The core problems are how to build the mapping between image features and gaze position, and how to compensate for head motion using the training samples. To this end, we first propose a new input feature, based on an appearance feature and a distance feature, that reflects how eye image features change with gaze position; as a result, the number of training samples needed during calibration is significantly reduced. Then ℓ1-optimization is used to select an optimal set of samples representing the mapping between the input feature and the gaze position. Finally, a linear equation is fitted to correct the initial estimation bias introduced by head motion. Experimental results demonstrate that our system achieves accurate results with one camera and a small number of calibration points, and the compensation equation improves the final gaze estimation accuracy by 22 %. In addition, our system is robust to eye blinking and changes in distance.
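The abstract outlines two computational steps: selecting a sparse set of calibration samples via ℓ1-optimization, and fitting a linear equation to compensate the head-motion bias. The paper's exact features and solver are not given here, so the following is only a minimal sketch of that pipeline: it solves the ℓ1-regularized least-squares (lasso) problem with a simple ISTA loop over a synthetic calibration dictionary, estimates gaze as a sparse weighted combination of calibration gaze positions, and then applies a least-squares linear correction. All data, dimensions, and parameter values are hypothetical.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of the l1 norm: shrink each entry toward zero by t.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def lasso_ista(D, y, lam=0.1, n_iter=500):
    """Solve min_w 0.5*||D w - y||^2 + lam*||w||_1 via ISTA.

    D: (d, n) dictionary whose columns are calibration eye-image features.
    y: (d,)  feature vector of the test image.
    Returns a sparse weight vector over the calibration samples.
    """
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    w = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ w - y)           # gradient of the quadratic term
        w = soft_threshold(w - grad / L, lam / L)
    return w

# --- toy calibration set (synthetic, for illustration only) ---
rng = np.random.default_rng(0)
n, d = 20, 8
D = rng.normal(size=(d, n))                # n calibration feature vectors
gaze = rng.uniform(0, 1920, size=(n, 2))   # known (x, y) screen positions

# Test feature built as a sparse mix of a few calibration features + noise.
w_true = np.zeros(n)
w_true[[3, 7]] = [0.6, 0.4]
y = D @ w_true + 0.01 * rng.normal(size=d)

# Step 1: l1-optimization selects a small set of relevant samples;
# the initial gaze estimate is their weighted gaze positions.
w = lasso_ista(D, y, lam=0.05)
w_pos = np.maximum(w, 0)
est = (w_pos / (w_pos.sum() + 1e-12)) @ gaze

# Step 2: fit a linear compensation equation true = a*est + b on
# (hypothetical) calibration pairs recorded under head motion, then
# apply it to correct the initial estimate's bias.
est_cal = rng.uniform(0, 1920, 50)
true_cal = 0.9 * est_cal + 30 + rng.normal(0, 5, 50)
a, b = np.polyfit(est_cal, true_cal, 1)
corrected = a * est + b
```

With an identity dictionary the ISTA loop reduces to a single soft-thresholding of `y`, which makes its behavior easy to check; in the paper's setting the sparsity of `w` is what keeps the number of required calibration points small.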