Synthesis-based low-cost gaze analysis

Conference Paper

Gaze analysis has gained much popularity over the years due to its relevance in a wide array of applications, including human-computer interaction, fatigue detection, and clinical mental health diagnosis. However, accurate gaze estimation from low-resolution images outside of the lab (in the wild) still proves to be a challenging task. The new low-cost Intel RealSense 3D camera, capable of acquiring submillimeter-resolution depth information, is currently available in laptops, and such technology is expected to become ubiquitous in other portable devices. In this paper, we focus on low-cost, scalable, and real-time analysis of human gaze using this RealSense camera. We exploit the direct measurement of eye surface geometry captured by the RGB-D camera and perform gaze estimation through novel synthesis-based training and testing. Furthermore, we synthesize different eye movement appearances using a linear approach. From each 3D eye training sample captured by the RealSense camera, we synthesize multiple novel 2D views by varying the view angle to simulate head motions expected at testing. We then learn from the synthesized 2D eye images a gaze regression model using regression forests. At testing, for each captured RGB-D eye image, we first repeat the same synthesis process. For each synthesized image, we estimate the gaze from our gaze regression model and factor out the associated camera/head motion. In this way, we obtain multiple gaze estimations for each RGB-D eye image, and the consensus is adopted. We show that this synthesis-based training and testing significantly improves the precision of gaze estimation, opening the door to true low-cost solutions.
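The train/test loop the abstract describes (expand each 3D sample into multiple synthesized 2D views, fit a regression forest, then at test time estimate per view, factor out the known simulated rotation, and take the consensus) can be sketched as follows. This is only an illustrative sketch under stated assumptions, not the authors' implementation: scikit-learn's `RandomForestRegressor` stands in for the paper's regression forests, and `synthesize_views` is a hypothetical placeholder for the paper's linear view-synthesis step.

```python
# Hedged sketch of synthesis-based training and testing, assuming toy
# feature vectors in place of real RealSense RGB-D eye images.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def synthesize_views(eye_3d, angles):
    """Placeholder for the paper's linear view synthesis: produce one
    2D appearance per simulated head-rotation angle."""
    return [np.asarray(eye_3d, dtype=float) + a for a in angles]

ANGLES = [-10.0, -5.0, 0.0, 5.0, 10.0]  # simulated head rotations (degrees)

# --- Training: expand each 3D training sample into multiple 2D views ---
rng = np.random.default_rng(0)
train_eyes = rng.normal(size=(50, 8))  # toy stand-ins for 3D eye samples
train_gaze = rng.normal(size=50)       # toy gaze labels (one angle each)
X, y = [], []
for eye, gaze in zip(train_eyes, train_gaze):
    for angle, view in zip(ANGLES, synthesize_views(eye, ANGLES)):
        X.append(view)
        y.append(gaze + angle)  # label shifted by the simulated rotation
forest = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)

# --- Testing: repeat the synthesis, estimate per view, factor out the
# known simulated motion, and adopt the consensus (here, the median) ---
def estimate_gaze(eye_3d):
    views = np.asarray(synthesize_views(eye_3d, ANGLES))
    per_view = forest.predict(views) - np.asarray(ANGLES)
    return float(np.median(per_view))
```

The consensus step is what makes the multiple synthesized views pay off at test time: each view contributes one motion-compensated estimate, and an outlier-robust statistic over them (a median in this sketch) filters out views whose estimates are poor.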

Cited Authors

  • Chang, Z; Qiu, Q; Sapiro, G

Published Date

  • January 1, 2016

Volume / Issue

  • 618 /

Start / End Page

  • 95 - 100

International Standard Serial Number (ISSN)

  • 1865-0929

International Standard Book Number 13 (ISBN-13)

  • 9783319405414

Digital Object Identifier (DOI)

  • 10.1007/978-3-319-40542-1_15

Citation Source

  • Scopus