User-defined interface gestures: dataset and analysis

Daniela Grijincu, Miguel Nacenta, Per Ola Kristensson

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

28 Citations (Scopus)


We present a video-based gesture dataset and a methodology for annotating video-based gesture datasets. Our dataset consists of user-defined gestures generated by 18 participants from a previous investigation of gesture memorability. We design and use a crowd-sourced classification task to annotate the videos. The results are made available through a web-based visualization that allows researchers and designers to explore the dataset. Finally, we perform an additional descriptive analysis and quantitative modeling exercise that provide additional insights into the results of the original study.
To facilitate the use of the presented methodology by other researchers, we share the data, the source of the human intelligence tasks for crowdsourcing, a new taxonomy that integrates previous work, and the source code of the visualization tool.
Original language: English
Title of host publication: Proceedings of the 9th ACM International Conference on Interactive Tabletops and Surfaces (ITS 2014)
Place of Publication: New York, NY
Number of pages: 10
ISBN (Electronic): 9781450325875
Publication status: Published - 16 Nov 2014


  • Gesture design
  • User-defined gestures
  • Gesture elicitation
  • Gesture analysis methodology
  • Gesture annotation
  • Gesture memorability
  • Gestures
  • Gesture datasets
  • Crowdsourcing


