DynamicRead: exploring robust gaze interaction methods for reading on handheld mobile devices under dynamic conditions

Yaxiong Lei*, Yuheng Wang*, Tyler Caslin, Alexander Wisowaty, Xu Zhu, Mohamed Khamis*, Juan Ye*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

Enabling real-time gaze interaction on handheld mobile devices has attracted significant attention in recent years. An increasing number of research projects have focused on sophisticated appearance-based deep learning models to improve the precision of gaze estimation on smartphones. This raises important research questions, including how gaze can be used in a real-time application, and which gaze interaction methods are preferable under dynamic conditions in terms of both user acceptance and reliable performance. To address these questions, we design four gaze-based scrolling techniques: three explicit techniques based on Gaze Gesture, Dwell time, and Pursuit, and one implicit technique based on reading speed, all supporting touch-free page scrolling in a reading application. We conduct a 20-participant user study under both sitting and walking settings. Our results reveal that the Gaze Gesture and Dwell time-based interfaces are more robust while walking, and that Gaze Gesture achieves consistently good usability scores without imposing a high cognitive workload.
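The paper itself does not include code. As a purely illustrative sketch of one of the explicit techniques described above, a Dwell time-based scroll trigger fires a page scroll once gaze has rested in a screen region for a set duration. The region names ("top", "bottom") and the 0.8 s threshold below are assumptions for illustration, not values from the study.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class DwellScroller:
    """Illustrative dwell-time scroll trigger (hypothetical, not the paper's code).

    Feed it gaze samples labelled with the screen region being looked at; once
    gaze rests continuously in the top or bottom region for `dwell_threshold`
    seconds, it emits a scroll command and waits for a fresh dwell.
    """

    dwell_threshold: float = 0.8  # assumed: seconds of sustained gaze required
    _region: Optional[str] = None
    _dwell_start: Optional[float] = None

    def update(self, region: Optional[str], timestamp: float) -> Optional[str]:
        """Process one gaze sample; return 'scroll_down'/'scroll_up' on dwell completion."""
        if region != self._region:
            # Gaze moved to a different region: restart the dwell timer.
            self._region = region
            self._dwell_start = timestamp
            return None
        if region in ("bottom", "top") and self._dwell_start is not None:
            if timestamp - self._dwell_start >= self.dwell_threshold:
                # Fire once, then require the user to re-dwell before firing again.
                self._dwell_start = None
                return "scroll_down" if region == "bottom" else "scroll_up"
        return None
```

A gaze-gesture or pursuit variant would replace the timer with, respectively, a saccade-pattern matcher or a correlation test against an on-screen moving target; the one-shot reset above is a common design choice to avoid repeated triggering while the reader's gaze lingers.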
Original language: English
Article number: 158
Number of pages: 17
Journal: Proceedings of the ACM on Human-Computer Interaction
Volume: 7
Issue number: ETRA
DOIs
Publication status: Published - 18 May 2023
Event: The 2023 ACM Symposium on Eye Tracking Research & Applications (ETRA), Tübingen, Germany
Duration: 29 May 2023 – 2 Jun 2023
https://etra.acm.org/2023/index.html

Keywords

  • Eye Tracking
  • Mobile devices
  • Smartphones
  • Gaze-based Interaction
  • Dwell
  • Pursuit
  • Gaze Gesture
  • Scrolling Techniques
  • Reading

