LET: Linguistic Knowledge Enhanced Graph Transformer for Chinese Short Text Matching

Authors

  • Boer Lyu, X-LANCE Lab, Department of Computer Science and Engineering, Shanghai Jiao Tong University
  • Lu Chen, X-LANCE Lab, Department of Computer Science and Engineering, Shanghai Jiao Tong University
  • Su Zhu, X-LANCE Lab, Department of Computer Science and Engineering, Shanghai Jiao Tong University
  • Kai Yu, X-LANCE Lab, Department of Computer Science and Engineering, Shanghai Jiao Tong University

DOI:

https://doi.org/10.1609/aaai.v35i15.17592

Keywords:

Text Classification & Sentiment Analysis

Abstract

Chinese short text matching is a fundamental task in natural language processing. Existing approaches usually take Chinese characters or words as input tokens, which leads to two limitations: 1) some Chinese words are polysemous, and their semantic information is not fully utilized; 2) some models suffer from potential issues caused by word segmentation. Here we introduce HowNet as an external knowledge base and propose a Linguistic knowledge Enhanced graph Transformer (LET) to deal with word ambiguity. Additionally, we adopt a word lattice graph as input to preserve multi-granularity information. Our model is also complementary to pre-trained language models. Experimental results on two Chinese datasets show that our models outperform various typical text matching approaches. An ablation study further indicates that both semantic information and multi-granularity information are important for text matching modeling.
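
For readers unfamiliar with word lattices, the short Python sketch below (illustrative only, not from the paper; the toy vocabulary and helper names are hypothetical) shows how a multi-granularity word lattice graph can be built for a Chinese sentence: every character is kept as a node alongside every in-vocabulary word that spans it, and forward edges connect lattice neighbours so a graph transformer can attend over all segmentation paths at once.

# Minimal sketch (not the authors' code): build a word-lattice graph by
# matching a toy dictionary against the character sequence. Nodes are
# (text, start, end) spans; edges link a node to any node starting where
# it ends, so all segmentation granularities coexist in one graph.
from itertools import product

VOCAB = {"南京", "南京市", "市长", "长江", "长江大桥", "大桥"}  # hypothetical toy vocabulary

def build_lattice(sentence):
    """Return lattice nodes (text, start, end) covering the sentence."""
    nodes = []
    for start in range(len(sentence)):
        # single characters always enter the lattice
        nodes.append((sentence[start], start, start + 1))
        # multi-character spans enter only if they are dictionary words
        for end in range(start + 2, len(sentence) + 1):
            if sentence[start:end] in VOCAB:
                nodes.append((sentence[start:end], start, end))
    return nodes

def lattice_edges(nodes):
    """Forward edges: node i -> node j whenever j starts where i ends."""
    return [(a, b) for a, b in product(nodes, nodes) if a[2] == b[1]]

if __name__ == "__main__":
    sent = "南京市长江大桥"
    nodes = build_lattice(sent)
    print(nodes)                      # characters plus "南京", "南京市", "市长", "长江", "长江大桥", "大桥"
    print(len(lattice_edges(nodes)), "forward edges")

In LET itself these lattice nodes are further enriched with HowNet sense information and encoded with graph attention; the sketch above only illustrates the multi-granularity input structure referred to in the abstract.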

Published

2021-05-18

How to Cite

Lyu, B., Chen, L., Zhu, S., & Yu, K. (2021). LET: Linguistic Knowledge Enhanced Graph Transformer for Chinese Short Text Matching. Proceedings of the AAAI Conference on Artificial Intelligence, 35(15), 13498-13506. https://doi.org/10.1609/aaai.v35i15.17592

Issue

Vol. 35 No. 15 (2021)

Section

AAAI Technical Track on Speech and Natural Language Processing II