Multi-scale gestural interaction for augmented reality

Barrett Ens, Aaron John Quigley, Hui Shyong Yeo, Pourang Irani, Mark Billinghurst

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

We present a multi-scale gestural interface for augmented reality applications. With virtual objects, gestural interactions such as pointing and grasping can be convenient and intuitive; however, they are imprecise, socially awkward, and susceptible to fatigue. Our prototype application uses multiple sensors to detect gestures from both arm and hand motions (macro-scale) and finger motions (micro-scale). Microgestures can provide precise input through a belt-worn sensor configuration, with the hand in a relaxed posture. We present an application that combines direct manipulation with microgestures for precise interaction, beyond the capabilities of direct manipulation alone.
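The abstract's core idea is routing input from two gesture scales to one target: macro-scale arm/hand gestures for coarse direct manipulation, and micro-scale finger gestures for precise adjustment. The following is a minimal sketch of that dispatch pattern, not the authors' implementation; all class names, event kinds, and the specific sensor payloads are assumptions for illustration.

```python
# Hypothetical sketch of multi-scale gesture dispatch (not the paper's code).
# Macro-scale events (arm/hand tracking) drive coarse direct manipulation;
# micro-scale events (finger motion from a belt-worn sensor) drive fine tweaks.
from dataclasses import dataclass, field


@dataclass
class VirtualObject:
    position: tuple = (0.0, 0.0, 0.0)
    scale: float = 1.0


@dataclass
class MultiScaleController:
    """Routes gesture events by scale to one selected virtual object."""
    target: VirtualObject = field(default_factory=VirtualObject)

    def on_macro_gesture(self, kind: str, payload) -> None:
        # Macro gestures: e.g. a grab-and-move translates the object in
        # large steps proportional to tracked hand displacement.
        if kind == "grab_move":
            x, y, z = self.target.position
            dx, dy, dz = payload
            self.target.position = (x + dx, y + dy, z + dz)

    def on_micro_gesture(self, kind: str, payload) -> None:
        # Microgestures: small finger motions make fine-grained adjustments,
        # here a 1% scale change per detected finger-rub tick (assumed unit).
        if kind == "finger_rub":
            self.target.scale *= 1.0 + 0.01 * payload


ctrl = MultiScaleController()
ctrl.on_macro_gesture("grab_move", (0.5, 0.0, 0.0))  # coarse placement
ctrl.on_micro_gesture("finger_rub", 3)               # precise refinement
```

The design point this illustrates is that both input streams mutate the same object state, so the precise micro-scale channel refines what the coarse macro-scale channel roughs in.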
Original language: English
Title of host publication: SA '17 SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications
Place of Publication: New York
Publisher: ACM
Number of pages: 2
ISBN (Electronic): 9781450354103
Publication status: Published - 27 Nov 2017
Event: 10th ACM SIGGRAPH Conference and Exhibition on Computer Graphics and Interactive Techniques in Asia, Bangkok, Thailand
Duration: 27 Nov 2017 - 30 Nov 2017
Conference number: 10
https://sa2017.siggraph.org/

Keywords

  • Microgestures
  • Gesture interaction
  • Augmented reality
