Complementing text entry evaluations with a composition task

Keith Vertanen, Per Ola Kristensson

Research output: Contribution to journal › Article › peer-review

34 Citations (Scopus)

Abstract

A common methodology for evaluating text entry methods is to ask participants to transcribe a predefined set of memorable sentences or phrases. In this article, we explore whether we can complement the conventional transcription task with a more externally valid composition task. In a series of large-scale crowdsourced experiments, we found that participants could consistently and rapidly invent high-quality and creative compositions with only modest reductions in entry rates. Based on our series of experiments, we provide a best-practice procedure for using composition tasks in text entry evaluations. This includes a judging protocol that can be performed either by the experimenters or by crowdsourced workers on a microtask market. We evaluated our composition task procedure using a text entry method unfamiliar to participants. Our empirical results show that the composition task can serve as a valid complementary text entry evaluation method.
Original language: English
Article number: 8
Number of pages: 33
Journal: ACM Transactions on Computer-Human Interaction
Volume: 21
Issue number: 2
DOIs
Publication status: Published - Feb 2014

Keywords

  • Text entry evaluation
  • Composition
  • Transcription
  • Crowdsourcing

