Abstract
Human-Computer Interaction (HCI) is a research area which studies how people interact with computer systems. Because of its multidisciplinary nature, HCI modules often sit uneasily within the computer science curriculum, which is primarily composed of modules typically assessed through objective measures using quantitative methods. Assessment criteria for HCI topics need to make some subjective qualities quantifiable (e.g. aesthetics and creativity). In the case of large classes, it is critical that the assessment can scale appropriately without compromising the validity of the judgment of how well the learning outcomes have been achieved.
In the HCI module 'Interaction Design' at the University of Southampton, faced with increasing student numbers (from fewer than 80 to over 160 in two years), lecturers redesigned the assessment to provide timely feedback. The module is assessed by exam and coursework, where the exam includes a large section composed of multiple-choice questions (MCQs). To foster higher-order learning, students were encouraged to author MCQs using the platform PeerWise, which proved useful as a revision aid for the exam.
In the coursework, students are required to conduct qualitative research, which in turn informs the creation of prototypes for technical solutions to problems from diverse areas of interest. Providing students with such a diversity of choices encourages creativity and freedom, as well as the application of the theoretical background of human-computer interaction.
This presentation explains the authors' approach to assessment: both in supporting the creation of MCQs and exam revision, and in how the medium of video allowed for the expression of creativity and application of knowledge, while making marking considerably easier than traditional alternatives and thereby enabling timely feedback to students.
Original language | English |
---|---|
Publication status | Published - 12 Jan 2018 |
Event | Computing Education Practice conference - Durham University, Durham, United Kingdom. Duration: 11 Jan 2018 → 12 Jan 2018. Conference number: 2. http://community.dur.ac.uk/cep.conference/2018/programme.php#abstract13 |
Conference
Conference | Computing Education Practice conference |
---|---|
Abbreviated title | CEP |
Country/Territory | United Kingdom |
City | Durham |
Period | 11/01/18 → 12/01/18 |
Internet address | http://community.dur.ac.uk/cep.conference/2018/programme.php#abstract13 |