KEML: A Knowledge-Enriched Meta-Learning Framework for Lexical Relation Classification

Authors

  • Chengyu Wang, Zhejiang Lab and Alibaba Group
  • Minghui Qiu, Alibaba Group
  • Jun Huang, Alibaba Group
  • Xiaofeng He, East China Normal University

DOI:

https://doi.org/10.1609/aaai.v35i15.17640

Keywords:

Lexical & Frame Semantics, Semantic Parsing, Information Extraction, Knowledge Acquisition, General

Abstract

Lexical relations describe how concepts are semantically related, in the form of relation triples. Accurately predicting lexical relations between concepts is challenging, due to the sparsity of patterns that indicate their existence. We propose the Knowledge-Enriched Meta-Learning (KEML) framework to address lexical relation classification (LRC). In KEML, the LKB-BERT (Lexical Knowledge Base-BERT) model is first presented to learn concept representations from text corpora, with rich lexical knowledge injected via distant supervision. A probabilistic distribution over auxiliary tasks is defined to strengthen the model's ability to recognize different types of lexical relations. We further propose a neural classifier integrated with special relation recognition cells, which combines meta-learning over the auxiliary task distribution with supervised learning for LRC. Experiments over multiple datasets show that KEML outperforms state-of-the-art methods.
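To make the training recipe in the abstract concrete, below is a minimal sketch of meta-learning over a distribution of auxiliary relation-recognition tasks followed by a supervised LRC head. It assumes PyTorch, uses random tensors as stand-ins for LKB-BERT concept embeddings, and substitutes a Reptile-style first-order meta-update; the relation inventory, dimensions, and all function names are illustrative, and the paper's actual auxiliary-task distribution and relation recognition cells are not reproduced here.

```python
# Illustrative sketch only: random stand-in embeddings replace LKB-BERT
# outputs, and a Reptile-style update stands in for the paper's scheme.
import copy
import random
import torch
import torch.nn as nn

EMB_DIM = 768          # assumed size of LKB-BERT concept embeddings
NUM_RELATIONS = 5      # e.g. synonymy, antonymy, hypernymy, meronymy, random

class RelationClassifier(nn.Module):
    """Scores a (concept_a, concept_b) pair; out_dim switches between
    binary auxiliary heads and the final multi-class LRC head."""
    def __init__(self, out_dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * EMB_DIM, 256), nn.ReLU(),
            nn.Linear(256, out_dim),
        )

    def forward(self, a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([a, b], dim=-1))

def sample_auxiliary_task(batch_size: int = 32):
    """Stand-in for the auxiliary-task distribution: pick one relation
    type and ask whether each pair expresses it (binary labels)."""
    relation = random.randrange(NUM_RELATIONS)
    a = torch.randn(batch_size, EMB_DIM)     # placeholder embeddings
    b = torch.randn(batch_size, EMB_DIM)
    y = torch.randint(0, 2, (batch_size,))   # 1 = pair expresses relation
    return relation, a, b, y

def meta_train(model: nn.Module, episodes: int = 100,
               inner_steps: int = 5, inner_lr: float = 1e-3,
               meta_lr: float = 0.1) -> None:
    """First-order (Reptile-style) meta-learning: adapt a clone on one
    sampled auxiliary task, then move the shared weights toward it."""
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(episodes):
        _, a, b, y = sample_auxiliary_task()
        clone = copy.deepcopy(model)
        opt = torch.optim.SGD(clone.parameters(), lr=inner_lr)
        for _ in range(inner_steps):
            opt.zero_grad()
            loss_fn(clone(a, b), y).backward()
            opt.step()
        with torch.no_grad():
            for p, q in zip(model.parameters(), clone.parameters()):
                p += meta_lr * (q - p)   # interpolate toward adapted weights

if __name__ == "__main__":
    aux_model = RelationClassifier(out_dim=2)   # binary auxiliary head
    meta_train(aux_model)
    # Supervised LRC would then reuse the meta-learned body with a
    # NUM_RELATIONS-way head, fine-tuned on labeled relation triples.
```

The design point this sketch mirrors is that each auxiliary episode is a single-relation recognition problem drawn from a task distribution, so the shared weights accumulate relation-discriminating structure before the final multi-class classifier is trained on labeled triples.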

Published

2021-05-18

How to Cite

Wang, C., Qiu, M., Huang, J., & He, X. (2021). KEML: A Knowledge-Enriched Meta-Learning Framework for Lexical Relation Classification. Proceedings of the AAAI Conference on Artificial Intelligence, 35(15), 13924-13932. https://doi.org/10.1609/aaai.v35i15.17640

Section

AAAI Technical Track on Speech and Natural Language Processing II