Hybrid Attention-Based Prototypical Networks for Noisy Few-Shot Relation Classification


  • Tianyu Gao Tsinghua University
  • Xu Han Tsinghua University
  • Zhiyuan Liu Tsinghua University
  • Maosong Sun Tsinghua University




Existing methods for relation classification (RC) primarily rely on distant supervision (DS) because large-scale supervised training datasets are not readily available. Although DS automatically annotates adequate amounts of data for model training, the coverage of this data is still quite limited, and many long-tail relations suffer from data sparsity. Intuitively, people can grasp new knowledge from only a few instances. We thus provide a different view on RC by formalizing it as a few-shot learning (FSL) problem. However, current FSL models mainly focus on low-noise vision tasks, which makes it hard for them to directly handle the diversity and noise of text. In this paper, we propose hybrid attention-based prototypical networks for the problem of noisy few-shot RC. We design instance-level and feature-level attention schemes based on prototypical networks to highlight the crucial instances and features respectively, which significantly enhances the performance and robustness of RC models in a noisy FSL scenario. Moreover, our attention schemes accelerate the convergence of RC models. Experimental results demonstrate that our hybrid attention-based models require fewer training iterations and outperform the state-of-the-art baseline models. The code and datasets are released at https://github.com/thunlp/HATT-Proto.
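To make the two attention schemes concrete, the following is a minimal NumPy sketch of query-conditioned prototypical classification. The instance-level attention here is a simple softmax over support-query dot products, and the feature-level weights are taken as inverse within-class variance (stand-ins for the learned attention modules in the paper; the function name and weighting choices are illustrative assumptions, not the released implementation).

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def hybrid_attention_classify(support, query):
    """Classify `query` against N relations, each with K support embeddings.

    support: (N, K, D) array of support-instance embeddings.
    query:   (D,) array, the query embedding.

    Instance-level attention: each prototype is a query-conditioned
    weighted mean of its support instances, so noisy instances that are
    dissimilar to the query get down-weighted.
    Feature-level attention: squared distances are rescaled per feature;
    here low-variance (stable) features within a class count more --
    a simple stand-in for the paper's learned feature attention.
    """
    N, K, D = support.shape
    scores = np.empty(N)
    for c in range(N):
        # instance-level attention weights over the K support instances
        alpha = softmax(support[c] @ query)            # (K,)
        proto = alpha @ support[c]                     # (D,) prototype
        # feature-level weights: inverse within-class variance, mean 1
        inv_var = 1.0 / (support[c].var(axis=0) + 1e-6)
        z = inv_var / inv_var.sum() * D                # (D,)
        # negative weighted squared distance as the class score
        scores[c] = -np.sum(z * (query - proto) ** 2)
    return int(np.argmax(scores)), softmax(scores)
```

With both schemes disabled (uniform `alpha` and `z`), this reduces to a vanilla prototypical network; the attention terms only change how the prototype and the distance metric are computed.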




How to Cite

Gao, T., Han, X., Liu, Z., & Sun, M. (2019). Hybrid Attention-Based Prototypical Networks for Noisy Few-Shot Relation Classification. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01), 6407-6414. https://doi.org/10.1609/aaai.v33i01.33016407



AAAI Technical Track: Natural Language Processing