Peer learning that engages students in multiple-choice question (MCQ) formulation promotes higher task engagement and deeper learning than simply answering MCQs in summative assessment. Yet the literature on deployments of student-authored MCQ software is presently biased towards accounts from Science, Technology, Engineering, Mathematics and Medicine (STEMM) subjects, rather than discursive subjects or disciplines whose content contains fewer absolute facts and objective metrics and more nuance. We report qualitative and quantitative findings from a semester-long deployment of a peer-learning software package (PeerWise) in a 140-student course on Interaction Design. PeerWise enables students to author, rate and comment upon their peers' MCQs. The platform was enthusiastically adopted as a revision aid, yet overall question quality was poor, and students reported difficulty in translating the discursive nature of the course content into MCQs with only one correct answer. In addressing these shortcomings, this paper offers specific recommendations to instructors of more discursive subjects who use student-led MCQ authoring platforms, and discusses how platforms such as PeerWise might be adapted to better suit disciplines characterised by discursive content. We propose alternative approaches to moderation and two potential amendments to the software itself.