Quantifying the impact of motion on 2D gaze estimation in real-world mobile interactions

Yaxiong Lei, Yuheng Wang, Fergus Buchanan, Mingyue Zhao, Yusuke Sugano, Shijing He, Mohamed Khamis, Juan Ye

Research output: Working paper › Preprint

Abstract

Mobile gaze tracking involves inferring a user's gaze point or direction on a mobile device's screen from facial images captured by the device's front camera. While this technology inspires an increasing number of gaze-interaction applications, achieving consistent accuracy remains challenging due to the dynamic user-device spatial relationships and varied motion conditions inherent in mobile use. This paper provides empirical evidence on how user mobility and behaviour affect mobile gaze tracking accuracy. We conduct two user studies collecting behaviour and gaze data under various motion conditions, from lying down to maze navigation, and during different interaction tasks. Quantitative analysis reveals behavioural regularities across daily tasks and identifies head distance, head pose, and device orientation as key factors affecting accuracy, with errors increasing by up to 48.91% in dynamic conditions compared to static ones. These findings highlight the need for more robust, adaptive eye-tracking systems that account for head movements and device deflection to maintain accuracy across diverse mobile contexts.
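As an illustrative aside (not drawn from the paper itself), the sketch below shows one common way 2D gaze error and the relative increase between conditions can be quantified, assuming predicted gaze points and ground-truth targets are available in screen coordinates (e.g. millimetres); the arrays here are placeholder data, not the study's results:

```python
import numpy as np

def gaze_error_mm(predicted: np.ndarray, target: np.ndarray) -> np.ndarray:
    """Per-sample Euclidean gaze error on the screen plane.

    predicted, target: arrays of shape (N, 2) holding (x, y) screen
    coordinates in millimetres. Returns an array of N error values.
    """
    return np.linalg.norm(predicted - target, axis=1)

# Hypothetical per-condition samples (random placeholders, not real data).
rng = np.random.default_rng(0)
targets = rng.uniform(0, 60, size=(100, 2))
static_pred = targets + rng.normal(0, 5, size=(100, 2))   # smaller noise
dynamic_pred = targets + rng.normal(0, 8, size=(100, 2))  # larger noise

static_errors = gaze_error_mm(static_pred, targets)
dynamic_errors = gaze_error_mm(dynamic_pred, targets)

# Relative increase of mean error in a dynamic condition over a static
# baseline; the paper reports increases of up to 48.91% in this sense.
increase = (dynamic_errors.mean() / static_errors.mean() - 1) * 100
print(f"Mean error increase: {increase:.2f}%")
```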
Original language: English
Publisher: arXiv
Pages: 1-27
Number of pages: 27
Publication status: Submitted - 14 Feb 2025

Keywords

  • Mobile eye tracking
  • 2D gaze estimation
  • Calibration
  • Mobile devices
  • User studies
