Labels Need Prompts Too: Mask Matching for Natural Language Understanding Tasks

Authors

  • Bo Li, Peking University
  • Wei Ye, Peking University
  • Quansen Wang, Boston University
  • Wen Zhao, Peking University
  • Shikun Zhang, Peking University

DOI:

https://doi.org/10.1609/aaai.v38i16.29806

Keywords:

NLP: Other

Abstract

In many natural language understanding (NLU) tasks, textual label names (or descriptions) are semantically rich. In this paper, we incorporate the prompting methodology, which is widely used to enrich model input, into the label side for the first time. Specifically, we propose a Mask Matching method, which equips the input with one prompt and its label with another, and then makes predictions by matching their mask representations. We evaluate our method extensively on 8 NLU tasks across 14 datasets. The experimental results show that Mask Matching significantly outperforms its fine-tuning and conventional prompt-tuning counterparts, establishing state-of-the-art performance on several datasets. Mask Matching is particularly effective on NLU tasks with large label counts and informative label names. As a pioneering effort investigating label-side prompts, we also discuss open issues for future study.
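The matching idea described in the abstract can be sketched as follows. Note that this is a toy illustration, not the paper's implementation: the prompt templates, function names, and the character-level stand-in encoder are all assumptions made here for runnability, whereas the actual method derives mask representations from a pretrained masked language model.

```python
# Hypothetical sketch of Mask Matching: wrap the input in one prompt and each
# candidate label in another, obtain a representation for each prompt's [MASK]
# slot, and predict the label whose mask representation best matches the
# input's. A toy bag-of-characters encoder stands in for a real masked LM.
import math
from collections import Counter

def toy_mask_repr(prompted_text: str) -> Counter:
    """Stand-in for the [MASK] hidden state produced by an encoder."""
    return Counter(prompted_text.lower())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[k] * b[k] for k in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def mask_match(text: str, labels: list[str]) -> str:
    # Input-side and label-side prompt templates (illustrative assumptions).
    input_repr = toy_mask_repr(f"{text} Topic: [MASK]")
    label_reprs = {lab: toy_mask_repr(f"A text about [MASK]: {lab}")
                   for lab in labels}
    # Predict by matching mask representations.
    return max(labels, key=lambda lab: cosine(input_repr, label_reprs[lab]))
```

In the real method both mask representations come from the same pretrained model, so matching happens in a shared semantic space rather than over surface characters.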

Published

2024-03-24

How to Cite

Li, B., Ye, W., Wang, Q., Zhao, W., & Zhang, S. (2024). Labels Need Prompts Too: Mask Matching for Natural Language Understanding Tasks. Proceedings of the AAAI Conference on Artificial Intelligence, 38(16), 18453-18461. https://doi.org/10.1609/aaai.v38i16.29806

Section

AAAI Technical Track on Natural Language Processing I