SWAG demo: smart watch assisted gesture interaction for mixed reality head-mounted displays

Hyung-il Kim, Juyoung Lee, Hui Shyong Yeo, Aaron John Quigley, Woontack Woo

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

In this demonstration, we will show a prototype system that uses a sensor-fusion approach to robustly track the six degrees of freedom of hand movement and to support intuitive hand-gesture interaction and 3D object manipulation for Mixed Reality head-mounted displays. Robust tracking of the hand and fingers with an egocentric camera remains a challenging problem, especially under self-occlusion (for example, when the user tries to grab a virtual object in midair by closing the palm). Our approach leverages a common smartwatch worn on the wrist to provide more reliable palm and wrist orientation data, and fuses this data with the camera to achieve robust hand position and orientation for interaction.
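The abstract's core idea, blending watch-IMU orientation with camera-based hand tracking, can be illustrated with a minimal complementary-filter sketch. This is a hypothetical illustration, not the authors' implementation: the function name, the confidence threshold, and the blending weight `alpha` are all assumptions.

```python
# Illustrative sketch of the sensor-fusion idea: the watch IMU provides
# wrist orientation even when the hand occludes itself (e.g. a closed
# fist), while the egocentric camera provides hand position. All names
# and constants here are assumptions for illustration only.

def fuse_pose(cam_position, cam_orientation, imu_orientation,
              cam_confidence, alpha=0.98):
    """Return a 6-DoF pose: (position, orientation in degrees).

    When the camera's tracking confidence collapses (self-occlusion),
    fall back to the watch IMU for orientation; otherwise blend the two
    estimates, weighting the IMU more heavily via alpha.
    """
    if cam_confidence < 0.5:
        # Camera cannot see the fingers reliably: trust the watch.
        orientation = imu_orientation
    else:
        # Complementary blend of IMU and camera orientation estimates.
        orientation = [alpha * i + (1 - alpha) * c
                       for i, c in zip(imu_orientation, cam_orientation)]
    # Position still comes from the camera in this sketch.
    return cam_position, orientation

# Example: closed fist, so camera confidence is low.
pos, ori = fuse_pose([0.1, 0.2, 0.5], [10.0, 0.0, 0.0],
                     [45.0, 5.0, 0.0], cam_confidence=0.2)
```

In this low-confidence case the filter returns the IMU orientation unchanged, which is the fallback behavior the abstract motivates with the closed-palm grab example.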
Original language: English
Title of host publication: Adjunct Proceedings - 2018 IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2018
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 428-429
Number of pages: 2
ISBN (Electronic): 9781538675922
DOIs
Publication status: Published - 25 Apr 2019
Event: 17th IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2018 - Munich, Germany
Duration: 16 Oct 2018 - 20 Oct 2018
https://www.ismar2018.org/

Conference

Conference: 17th IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2018
Country/Territory: Germany
City: Munich
Period: 16/10/18 - 20/10/18
Internet address: https://www.ismar2018.org/

Keywords

  • Augmented reality
  • Wearable computing
  • 3D user interfaces
  • Hand interaction
  • Virtual 3D object manipulation
