Unsupervised Cross-Domain Rumor Detection with Contrastive Learning and Cross-Attention

Authors

  • Hongyan Ran, School of Computer and Information Technology & Beijing Key Lab of Traffic Data Analysis and Mining
  • Caiyan Jia, School of Computer and Information Technology & Beijing Key Lab of Traffic Data Analysis and Mining

DOI:

https://doi.org/10.1609/aaai.v37i11.26584

Keywords:

SNLP: Text Mining, SNLP: Information Extraction, SNLP: Language Models, SNLP: Text Classification

Abstract

Massive rumors usually emerge alongside breaking news or trending topics, seriously hindering the spread of truth. Existing rumor detection methods mostly focus on a single domain and therefore perform poorly in cross-domain scenarios due to domain shift. In this work, we propose an end-to-end instance-wise and prototype-wise contrastive learning model with a cross-attention mechanism for cross-domain rumor detection. The model not only performs cross-domain feature alignment but also enforces target samples to align with the corresponding prototypes of a given source domain. Since labels in the target domain are unavailable, we use a clustering-based approach, with centers carefully initialized by a batch of source-domain samples, to produce pseudo labels. Moreover, we apply a cross-attention mechanism to pairs of source and target data with the same labels to learn domain-invariant representations. Because the samples in such a pair tend to express similar semantic patterns, especially in people's attitudes (e.g., supporting or denying) toward the same category of rumors, the discrepancy between the source and target domains is decreased. We conduct experiments on four groups of cross-domain datasets and show that our proposed model achieves state-of-the-art performance.
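
The abstract describes three mechanisms: source-class prototypes, clustering-based pseudo-labeling of target samples, and cross-attention over same-label source/target pairs. The minimal PyTorch sketch below illustrates how these pieces could fit together; it is not the authors' released code, and every name, dimension, and hyperparameter (class_prototypes, d_model, tau, the single nearest-centroid assignment standing in for the paper's clustering step, and the toy losses) is an illustrative assumption.

import torch
import torch.nn as nn
import torch.nn.functional as F


def class_prototypes(src_feats, src_labels, num_classes):
    # Mean feature per class over a batch of labeled source samples
    # (assumes every class appears at least once in the batch).
    protos = torch.stack([src_feats[src_labels == c].mean(dim=0)
                          for c in range(num_classes)])
    return F.normalize(protos, dim=-1)


def pseudo_label(tgt_feats, protos):
    # One nearest-centroid assignment step: each unlabeled target sample
    # gets the label of its most similar source prototype.
    sims = F.normalize(tgt_feats, dim=-1) @ protos.t()
    return sims.argmax(dim=-1)


def info_nce(anchor, positive, tau=0.1):
    # Instance-wise contrastive loss: the i-th anchor's positive is the
    # i-th paired sample; other samples in the batch serve as negatives.
    a = F.normalize(anchor, dim=-1)
    p = F.normalize(positive, dim=-1)
    logits = a @ p.t() / tau
    return F.cross_entropy(logits, torch.arange(a.size(0)))


class CrossAttentionAligner(nn.Module):
    # Target tokens attend to a paired same-label source post so that both
    # domains are encoded through shared stance patterns.
    def __init__(self, d_model=64, n_heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, tgt_tokens, src_tokens):
        out, _ = self.attn(tgt_tokens, src_tokens, src_tokens)
        return out.mean(dim=1)  # pooled, domain-aligned target feature


if __name__ == "__main__":
    torch.manual_seed(0)
    B, T, D, C = 8, 12, 64, 2  # batch, tokens per post, feature dim, classes
    src_tokens = torch.randn(B, T, D)  # stand-in for encoded source posts
    tgt_tokens = torch.randn(B, T, D)  # stand-in for encoded target posts
    src_labels = torch.tensor([0, 1] * (B // 2))  # both classes present

    src_feats, tgt_feats = src_tokens.mean(dim=1), tgt_tokens.mean(dim=1)

    protos = class_prototypes(src_feats, src_labels, C)
    tgt_pseudo = pseudo_label(tgt_feats, protos)

    aligned = CrossAttentionAligner(D)(tgt_tokens, src_tokens)

    # Prototype-wise contrastive term: score each target sample against
    # the source prototypes, using its pseudo label as the target class.
    proto_logits = F.normalize(tgt_feats, dim=-1) @ protos.t() / 0.1
    loss = info_nce(aligned, src_feats) + F.cross_entropy(proto_logits, tgt_pseudo)
    print(f"toy combined loss: {loss.item():.4f}")

The paper's full model presumably combines these alignment terms with a rumor-classification objective trained end to end; the sketch only demonstrates the data flow among the three components.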

Published

2023-06-26

How to Cite

Ran, H., & Jia, C. (2023). Unsupervised Cross-Domain Rumor Detection with Contrastive Learning and Cross-Attention. Proceedings of the AAAI Conference on Artificial Intelligence, 37(11), 13510-13518. https://doi.org/10.1609/aaai.v37i11.26584

Section

AAAI Technical Track on Speech & Natural Language Processing