A topicality-driven QUD model for discourse processing

Yingxue Fu*, Mark Jan Nederhof, Anaïs Ollagnier

*Corresponding author for this work

Research output: Conference contribution (Chapter in Book/Report/Conference proceeding)

Abstract

Question Under Discussion (QUD) is a discourse framework that has attracted growing interest in NLP in recent years. Among existing QUD models, the QUD tree approach (Riester, 2019) focuses on reconstructing QUDs and their hierarchical relationships, using a single tree to represent discourse structure. A prior implementation achieved only moderate inter-annotator agreement, highlighting the challenging nature of this task. In this paper, we propose a new QUD model for annotating hierarchical discourse structure. Our annotation achieves high inter-annotator agreement: 81.45% for short files and 79.53% for long files of Wall Street Journal articles. We present preliminary results on using GPT-4 for automatic annotation, which suggest that even one of the best-performing LLMs still struggles to capture hierarchical discourse structure. Moreover, we compare our annotations with RST annotations. Lastly, we present an approach for integrating hierarchical and local discourse relation annotations within the proposed model.
Original language: English
Title of host publication: Proceedings of the 26th annual meeting of the Special Interest Group on Discourse and Dialogue (SIGDIAL'25)
Place of publication: Avignon University, France
Publisher: Association for Computational Linguistics
Number of pages: 17
Publication status: Accepted/In press - 23 Jun 2025
Event: Research in Dialogue and Discourse (SIGDIAL'25) - Avignon University, Avignon, France
Duration: 25 Aug 2025 – 27 Aug 2025
https://2025.sigdial.org/

Conference

Conference: Research in Dialogue and Discourse (SIGDIAL'25)
Country/Territory: France
City: Avignon
Period: 25/08/25 – 27/08/25
Internet address: https://2025.sigdial.org/
