Contextualized Non-Local Neural Networks for Sequence Learning


  • Pengfei Liu Fudan University
  • Shuaichen Chang The Ohio State University
  • Xuanjing Huang Fudan University
  • Jian Tang University of Montreal
  • Jackie Chi Kit Cheung McGill University & The Ohio State University



Recently, a large number of neural mechanisms and models have been proposed for sequence learning, of which self-attention, as exemplified by the Transformer model, and graph neural networks (GNNs) have attracted much attention. In this paper, we propose an approach that combines and draws on the complementary strengths of these two methods. Specifically, we propose contextualized non-local neural networks (CN3), which can both dynamically construct a task-specific structure of a sentence and leverage rich local dependencies within a particular neighbourhood.
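To make the idea concrete, here is a minimal numpy sketch of a single layer in the spirit the abstract describes: attention-style non-local affinities over all token pairs, combined with a bias toward a local neighbourhood. This is an illustrative assumption on our part, not the authors' implementation; the function name, the additive local bias, and the `window` parameter are all hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def contextualized_nonlocal_layer(X, window=1):
    """Illustrative layer (hypothetical): non-local pairwise affinities
    over all tokens, plus an additive bias toward tokens within
    `window` positions, i.e. the local neighbourhood."""
    n, d = X.shape
    scores = X @ X.T / np.sqrt(d)  # non-local pairwise affinities
    idx = np.arange(n)
    # Local-neighbourhood mask: 1 where tokens are within `window` positions.
    local = np.abs(idx[:, None] - idx[None, :]) <= window
    scores = scores + np.where(local, 1.0, 0.0)  # local bias (assumed form)
    A = softmax(scores, axis=-1)  # dynamically inferred sentence structure
    return A @ X  # aggregate each token's neighbours' features

# Example: 5 tokens with 8-dimensional embeddings.
X = np.random.RandomState(0).randn(5, 8)
Y = contextualized_nonlocal_layer(X, window=1)
```

The row-normalized matrix `A` plays the role of a task-specific, dynamically constructed structure, while the mask keeps local dependencies influential.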

Experimental results on ten NLP tasks in text classification, semantic matching, and sequence labelling show that our proposed model outperforms competitive baselines and discovers task-specific dependency structures, thus providing better interpretability to users.




How to Cite

Liu, P., Chang, S., Huang, X., Tang, J., & Cheung, J. C. K. (2019). Contextualized Non-Local Neural Networks for Sequence Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01), 6762-6769.



AAAI Technical Track: Natural Language Processing