ConvNTM: Conversational Neural Topic Model
Keywords: SNLP: Conversational AI/Dialogue Systems; SNLP: Text Mining
Abstract
Topic models have been studied extensively for many years owing to their great potential for analyzing and understanding texts. Recently, researchers have combined topic models with deep learning techniques, yielding Neural Topic Models (NTMs). However, existing NTMs are mainly evaluated on general document modeling without considering different textual analysis scenarios. We posit that topic modeling has different characteristics in different textual analysis tasks. In this paper, we propose a Conversational Neural Topic Model (ConvNTM) designed specifically for the conversational scenario. Unlike general document topic modeling, a conversation session lasts for multiple turns: each short-text utterance follows a single topic distribution, and these topic distributions are dependent across turns. Moreover, conversations involve roles, i.e., speakers and addressees, and topic distributions are partially determined by these roles. We take these factors into account to model topics in conversations via a multi-turn and multi-role formulation. We also leverage word co-occurrence relationships as a new training objective to further improve topic quality. Comprehensive experimental results on benchmark datasets demonstrate that our proposed ConvNTM achieves the best performance both in topic modeling and in typical downstream conversational tasks (i.e., dialogue act classification and dialogue response generation).
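To make the multi-turn, multi-role idea concrete, here is a toy sketch (not the paper's actual model; all names, the number of topics, the role biases, and the mixing weight `alpha` are illustrative assumptions): each turn's topic distribution is derived from the previous turn's distribution plus a role-dependent bias, so topics drift smoothly across turns while speakers and addressees pull toward different topics.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of logits.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

K = 4  # hypothetical number of topics

# Hypothetical role-specific topic biases (speaker vs. addressee).
role_bias = {
    "speaker":   [0.5, 0.0, 0.0, 0.0],
    "addressee": [0.0, 0.5, 0.0, 0.0],
}

def next_topic_dist(prev_dist, role, alpha=0.7):
    # Multi-turn dependence: the current turn's topic logits mix the
    # (log of the) previous turn's distribution with a role bias, so
    # consecutive turns share topics but roles shift the emphasis.
    logits = [alpha * math.log(p + 1e-9) + role_bias[role][k]
              for k, p in enumerate(prev_dist)]
    return softmax(logits)

# Roll the topic prior forward over a short three-turn conversation.
dist = [1.0 / K] * K  # uniform initial topic distribution
for turn, role in enumerate(["speaker", "addressee", "speaker"]):
    dist = next_topic_dist(dist, role)
    print(f"turn {turn} ({role}): " + ", ".join(f"{p:.2f}" for p in dist))
```

In ConvNTM itself these dependencies are learned neurally rather than hand-set as here; the sketch only shows the shape of the turn-to-turn, role-conditioned coupling the abstract describes.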
How to Cite
Sun, H., Tu, Q., Li, J., & Yan, R. (2023). ConvNTM: Conversational Neural Topic Model. Proceedings of the AAAI Conference on Artificial Intelligence, 37(11), 13609-13617. https://doi.org/10.1609/aaai.v37i11.26595
AAAI Technical Track on Speech & Natural Language Processing