PTGaze: Cross-domain gaze estimation via proxy tuning

Yafei Wang, Runze Yan, Yaxiong Lei, Xianping Fu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

To tackle the challenge of cross-domain performance degradation, a gaze estimation method based on proxy tuning, called PTGaze, is proposed. In PTGaze, a base model learns the gaze representation of the baseline model in the source domain, and an adapt model learns the gaze representation of the baseline model in the target domain. The gaze difference between the base and adapt models guides the final output, ensuring the method’s accuracy in the target domain. Experimental results show that the proposed method achieves higher cross-domain gaze estimation accuracy on five public datasets, using RT-Gene, Full-face, and Gaze360 as baseline models.
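
The abstract does not specify how the base/adapt difference is combined with the baseline prediction; the sketch below assumes the standard additive proxy-tuning rule, in which the frozen baseline's gaze output is shifted by the difference between the adapt and base proxies. The class and parameter names are hypothetical and only illustrate the general idea.

```python
import torch
import torch.nn as nn


class ProxyTunedGaze(nn.Module):
    """Hypothetical proxy-tuning head (a sketch, not the paper's exact method).

    The baseline gaze estimator stays frozen; a base proxy (trained on the
    source domain) and an adapt proxy (trained on the target domain) supply a
    correction equal to their prediction difference.
    """

    def __init__(self, baseline: nn.Module, base: nn.Module, adapt: nn.Module):
        super().__init__()
        self.baseline = baseline  # large baseline gaze model, kept frozen
        self.base = base          # proxy capturing source-domain behaviour
        self.adapt = adapt        # proxy capturing target-domain behaviour
        for p in self.baseline.parameters():
            p.requires_grad = False

    def forward(self, face: torch.Tensor) -> torch.Tensor:
        # Each model maps a face image to a gaze angle (e.g. pitch, yaw).
        g_baseline = self.baseline(face)
        g_base = self.base(face)
        g_adapt = self.adapt(face)
        # Additive proxy tuning: the adapt-minus-base difference encodes the
        # source-to-target shift and steers the baseline prediction.
        return g_baseline + (g_adapt - g_base)
```

Under this assumed form, only the two proxies need target-domain adaptation, while the baseline model is reused unchanged at inference.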
Original language: English
Title of host publication: Proceedings of the 2025 symposium on eye tracking research and applications (ETRA)
Editors: Yusuke Sugano, Mohamed Khamis, Aladine Chetouani, Ludwig Sidenmark, Alessandro Bruno
Place of Publication: New York
Publisher: ACM
Pages: 1-2
Number of pages: 2
ISBN (Print): 9798400714870
DOIs
Publication status: Published - 25 May 2025
