Abstract
A common methodology for evaluating text entry methods is to ask participants to transcribe a predefined set of memorable sentences or phrases. In this article, we explore whether we can complement the conventional transcription task with a more externally valid composition task. In a series of large-scale crowdsourced experiments, we found that participants could consistently and rapidly invent high-quality and creative compositions with only modest reductions in entry rates. Based on our series of experiments, we provide a best-practice procedure for using composition tasks in text entry evaluations. This includes a judging protocol that can be performed either by the experimenters or by crowdsourced workers on a microtask market. We evaluated our composition task procedure using a text entry method unfamiliar to participants. Our empirical results show that the composition task can serve as a valid complementary text entry evaluation method.
Original language | English
---|---
Article number | 8
Number of pages | 33
Journal | ACM Transactions on Computer-Human Interaction
Volume | 21
Issue number | 2
Publication status | Published - Feb 2014
Keywords
- Text entry evaluation
- Composition
- Transcription
- Crowdsourcing
Projects
- Text Entry by Inference: Eye Typing, Stenography, and Understanding Context of Use
Kristensson, P. O. (PI)
28/03/11 → 27/05/13
Project: Standard