Does Head Label Help for Long-Tailed Multi-Label Text Classification

Authors

  • Lin Xiao, Beijing Jiaotong University
  • Xiangliang Zhang, King Abdullah University of Science and Technology, Saudi Arabia
  • Liping Jing, Beijing Jiaotong University
  • Chi Huang, Beijing Jiaotong University
  • Mingyang Song, Beijing Jiaotong University

DOI

https://doi.org/10.1609/aaai.v35i16.17660

Keywords

Text Classification & Sentiment Analysis

Abstract

Multi-label text classification (MLTC) aims to annotate documents with the most relevant labels from a number of candidate labels. In real applications, the distribution of label frequency often exhibits a long tail, i.e., a few labels are associated with a large number of documents (a.k.a. head labels), while a large fraction of labels are associated with a small number of documents (a.k.a. tail labels). To address the challenge of insufficient training data for tail label classification, we propose a Head-to-Tail Network (HTTN) to transfer the meta-knowledge from the data-rich head labels to data-poor tail labels. The meta-knowledge is the mapping from few-shot network parameters to many-shot network parameters, which aims to promote the generalizability of tail classifiers. Extensive experimental results on three benchmark datasets demonstrate that HTTN consistently outperforms the state-of-the-art methods. The code and hyper-parameter settings are released for reproducibility.
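
The abstract outlines HTTN's central mechanism: on data-rich head labels, learn a mapping from classifier parameters estimated with only a few positive documents ("few-shot" parameters) to parameters learned from all positive documents ("many-shot" parameters), then apply that mapping to strengthen the tail-label classifiers. The sketch below illustrates this idea only; it is not the authors' released code, and every name (MetaMapper, doc_dim, the MSE regression objective, the placeholder weight tensors) is an illustrative assumption.

```python
# Minimal sketch of a head-to-tail parameter mapping, assuming classifier
# weights live in the same space as document embeddings. Not the paper's
# released implementation; all names and the MSE objective are assumptions.
import torch
import torch.nn as nn


class MetaMapper(nn.Module):
    """Maps few-shot classifier parameters to many-shot ones."""

    def __init__(self, dim: int, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, few_shot_params: torch.Tensor) -> torch.Tensor:
        return self.net(few_shot_params)


doc_dim = 300  # assumed document-embedding dimensionality
mapper = MetaMapper(doc_dim)
optimizer = torch.optim.Adam(mapper.parameters(), lr=1e-3)

# Meta-training on head labels, where both parameter versions are available:
# few-shot weights come from a small subsample of each head label's positive
# documents, many-shot weights from all of them (random placeholders here).
few_head = torch.randn(64, doc_dim)
many_head = torch.randn(64, doc_dim)
for _ in range(200):
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(mapper(few_head), many_head)
    loss.backward()
    optimizer.step()

# Inference on tail labels: only few-shot weights exist, so lift them with the
# learned mapping and use the result as the tail classifiers.
few_tail = torch.randn(10, doc_dim)
tail_classifier_weights = mapper(few_tail)
```

Regressing many-shot weights from few-shot weights with an MSE loss is one plausible instantiation of the "meta-knowledge" described above; the paper's actual architecture and training objective may differ, so consult the released code for exact details.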

Published

2021-05-18

How to Cite

Xiao, L., Zhang, X., Jing, L., Huang, C., & Song, M. (2021). Does Head Label Help for Long-Tailed Multi-Label Text Classification. Proceedings of the AAAI Conference on Artificial Intelligence, 35(16), 14103-14111. https://doi.org/10.1609/aaai.v35i16.17660

Section

AAAI Technical Track on Speech and Natural Language Processing III