EyeSyn: Psychology-inspired Eye Movement Synthesis for Gaze-based Activity Recognition

Publication, Conference
Lan, G; Scargill, T; Gorlatova, M
Published in: Proceedings of the 21st ACM/IEEE International Conference on Information Processing in Sensor Networks (IPSN 2022)
January 1, 2022

Recent advances in eye tracking have given birth to a new genre of gaze-based context sensing applications, ranging from cognitive load estimation to emotion recognition. To achieve state-of-the-art recognition accuracy, a large-scale, labeled eye movement dataset is needed to train deep learning-based classifiers. However, due to the heterogeneity in human visual behavior, as well as the labor-intensive and privacy-compromising data collection process, datasets for gaze-based activity recognition are scarce and hard to collect. To alleviate the sparse gaze data problem, we present EyeSyn, a novel suite of psychology-inspired generative models that leverages only publicly available images and videos to synthesize a realistic and arbitrarily large eye movement dataset. Taking gaze-based museum activity recognition as a case study, our evaluation demonstrates that EyeSyn can not only replicate the distinct patterns in the actual gaze signals that are captured by an eye tracking device, but also simulate the signal diversity that results from different measurement setups and subject heterogeneity. Moreover, in the few-shot learning scenario, EyeSyn can be readily incorporated with either transfer learning or meta-learning to achieve 90% accuracy, without the need for a large-scale dataset for training.
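
The paper's generative models are not reproduced on this page. As a rough, hypothetical illustration of what synthesizing an eye movement trace over a visual stimulus can look like, the sketch below generates a toy fixation-and-saccade gaze sequence on an image canvas. Every name, parameter, and distribution here (synthesize_gaze_trace, the log-normal fixation durations, the Gaussian fixation drift) is an assumption for illustration only, not EyeSyn's actual models.

```python
import numpy as np

def synthesize_gaze_trace(image_size=(1920, 1080), n_fixations=20,
                          sampling_rate=100, rng=None):
    """Generate a toy 2-D gaze trace as alternating fixations and saccades.

    Hypothetical sketch: fixation durations are drawn from a log-normal
    distribution, fixation centers are jittered with small Gaussian drift,
    and saccades are modeled as short straight jumps between fixations.
    """
    rng = np.random.default_rng(rng)
    w, h = image_size
    samples = []
    # Start from a random fixation point on the stimulus.
    center = rng.uniform([0, 0], [w, h])
    for _ in range(n_fixations):
        # Fixation: a few hundred milliseconds of small drift around the center.
        duration_s = rng.lognormal(mean=-1.5, sigma=0.4)
        n = max(1, int(duration_s * sampling_rate))
        drift = rng.normal(scale=3.0, size=(n, 2))          # pixel-level jitter
        samples.append(center + np.cumsum(drift, axis=0) * 0.1)
        # Saccade: a ~40 ms straight jump to a new target location.
        target = rng.uniform([0, 0], [w, h])
        n_sacc = max(2, int(0.04 * sampling_rate))
        t = np.linspace(0, 1, n_sacc)[:, None]
        samples.append(center * (1 - t) + target * t)
        center = target
    return np.clip(np.concatenate(samples), [0, 0], [w, h])

if __name__ == "__main__":
    trace = synthesize_gaze_trace(rng=42)
    print(trace.shape)  # (num_samples, 2) gaze coordinates in pixels
```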

Published In

Proceedings of the 21st ACM/IEEE International Conference on Information Processing in Sensor Networks (IPSN 2022)

DOI

10.1109/IPSN54338.2022.00026

Publication Date

January 1, 2022

Start / End Page

233 / 246
 

Citation

APA: Lan, G., Scargill, T., & Gorlatova, M. (2022). EyeSyn: Psychology-inspired Eye Movement Synthesis for Gaze-based Activity Recognition. In Proceedings 21st ACM IEEE International Conference on Information Processing in Sensor Networks IPSN 2022 (pp. 233–246). https://doi.org/10.1109/IPSN54338.2022.00026
Chicago: Lan, G., T. Scargill, and M. Gorlatova. “EyeSyn: Psychology-inspired Eye Movement Synthesis for Gaze-based Activity Recognition.” In Proceedings 21st ACM IEEE International Conference on Information Processing in Sensor Networks IPSN 2022, 233–46, 2022. https://doi.org/10.1109/IPSN54338.2022.00026.
ICMJE: Lan G, Scargill T, Gorlatova M. EyeSyn: Psychology-inspired Eye Movement Synthesis for Gaze-based Activity Recognition. In: Proceedings 21st ACM IEEE International Conference on Information Processing in Sensor Networks IPSN 2022. 2022. p. 233–46.
MLA: Lan, G., et al. “EyeSyn: Psychology-inspired Eye Movement Synthesis for Gaze-based Activity Recognition.” Proceedings 21st ACM IEEE International Conference on Information Processing in Sensor Networks IPSN 2022, 2022, pp. 233–46. Scopus, doi:10.1109/IPSN54338.2022.00026.
NLM: Lan G, Scargill T, Gorlatova M. EyeSyn: Psychology-inspired Eye Movement Synthesis for Gaze-based Activity Recognition. Proceedings 21st ACM IEEE International Conference on Information Processing in Sensor Networks IPSN 2022. 2022. p. 233–246.
