Description
The DynamicRead dataset contains eye movement data from 20 participants who performed reading tasks on a handheld mobile device while sitting and walking. Participants read 10 texts using four gaze interaction methods based on scrolling techniques, plus one touch-based interaction method. The dataset captures the impact of motion on eye movements and features an unstable eye-movement sampling frequency of 8-12 Hz, which poses a challenge for traditional eye-movement toolkits. It can support academic research on the effects of motion on eye movement and the development of robust gaze interaction methods for handheld mobile devices under dynamic conditions.

### Folder Structure

- `eye_data` -- eye movement data from the reading sessions
- `text_screenshot` -- screenshots of the different reading materials
- `heatmap_img` -- heat maps of each reading page
- `scanpath_img` -- scan paths of each reading page
- `text` -- the reading material
- `code` -- R code for visualising the reading eye movement data

### File Naming Convention

- `GAP1_GazeA_log.txt` -- Group A, Participant No. 1, scrolling technique GazeA

### Scrolling Techniques

- Gaze A: Auto-scrolling
- Gaze B: Heatbox
- Gaze C: Eye-Swipe
- Gaze D: Moving-bar
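The naming convention above can be decoded programmatically. The dataset's own code is in R, but as an illustrative sketch (not part of the dataset), the following Python snippet parses a log filename such as `GAP1_GazeA_log.txt` into its group, participant number, and scrolling technique, assuming all log files follow the documented pattern:

```python
import re

# Scrolling techniques as listed in the dataset description.
METHODS = {
    "GazeA": "Auto-scrolling",
    "GazeB": "Heatbox",
    "GazeC": "Eye-Swipe",
    "GazeD": "Moving-bar",
}

def parse_log_name(filename):
    """Parse a DynamicRead log filename like 'GAP1_GazeA_log.txt'.

    Returns a dict with group letter, participant number, and
    interaction method, or None if the name does not match.
    """
    m = re.match(r"G([A-Z])P(\d+)_(Gaze[A-D])_log\.txt$", filename)
    if not m:
        return None
    group, participant, method = m.groups()
    return {
        "group": group,
        "participant": int(participant),
        "method": method,
        "technique": METHODS[method],
    }

print(parse_log_name("GAP1_GazeA_log.txt"))
# → {'group': 'A', 'participant': 1, 'method': 'GazeA', 'technique': 'Auto-scrolling'}
```

Such a parser makes it easy to iterate over the `eye_data` folder and group trials by participant or technique before analysis.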
| Date made available | 31 May 2023 |
|---|---|
| Publisher | Zenodo |
| Date of data production | Apr 2023 |
Keywords
- Eye Movement
- Dynamic
- Mobile Devices
- Reading
Research output
- 1 Article
- DynamicRead: exploring robust gaze interaction methods for reading on handheld mobile devices under dynamic conditions
  Lei, Y., Wang, Y., Caslin, T., Wisowaty, A., Zhu, X., Khamis, M. & Ye, J., 18 May 2023, In: Proceedings of the ACM on Human-Computer Interaction. 7, ETRA, 17 p., 158.
  Research output: Contribution to journal › Article › peer-review
Open Access
Datasets
- DynamicRead: Eye Movement Data of Reading on Handheld Mobile Devices under Dynamic Conditions
  Lei, Y. (Creator), Wang, Y. (Creator), Khamis, M. (Creator) & Ye, J. (Creator), Zenodo, 31 May 2023
Dataset