Manifold-Based Verbalizer Space Re-embedding for Tuning-Free Prompt-Based Classification


  • Haochun Wang Harbin Institute of Technology
  • Sendong Zhao Harbin Institute of Technology
  • Chi Liu Harbin Institute of Technology
  • Nuwa Xi Harbin Institute of Technology
  • MuZhen Cai Harbin Institute of Technology
  • Bing Qin Harbin Institute of Technology
  • Ting Liu Harbin Institute of Technology



NLP: Text Classification, NLP: Applications


Prompt-based classification recasts a task as a cloze question containing the [MASK] token; the tokens predicted for the mask are then mapped to labels through pre-defined verbalizers. Recent studies have explored verbalizer embeddings to reduce the labor in this process. However, all existing studies require a tuning process, for either the pre-trained model or additional trainable embeddings. Meanwhile, distances between high-dimensional verbalizer embeddings should not be measured with Euclidean distance, since the representation space may lie on a non-linear manifold. In this study, we propose a tuning-free, manifold-based space re-embedding method for verbalizer embeddings, called Locally Linear Embedding with Intra-class Neighborhood Constraint (LLE-INC), which preserves local properties within the same class as guidance for classification. Experimental results indicate that, even without tuning any parameters, LLE-INC is on par with automated verbalizers that require parameter tuning. With parameter updating, our approach further enhances prompt-based tuning by up to 3.2%. Furthermore, experiments with LLaMA-7B and LLaMA-13B indicate that LLE-INC is an efficient tuning-free classification approach for hyper-scale language models.
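To make the idea concrete, the following is a minimal sketch of classical Locally Linear Embedding in which each point's reconstruction neighborhood is restricted to samples of the same class, approximating the intra-class neighborhood constraint described above. This is an illustrative reading of the abstract, not the authors' implementation; the function name `lle_inc_reembed` and the parameters `n_neighbors` and `reg` are assumptions for the sketch.

```python
import numpy as np

def lle_inc_reembed(X, y, n_neighbors=3, reg=1e-3, out_dim=2):
    """Sketch: LLE where neighbors are restricted to same-class points
    (an illustrative stand-in for the intra-class neighborhood constraint).
    X: (n, d) verbalizer embeddings; y: (n,) class labels."""
    n = X.shape[0]
    W = np.zeros((n, n))
    for i in range(n):
        # Candidate neighbors: same-class points, excluding the point itself.
        idx = np.where((y == y[i]) & (np.arange(n) != i))[0]
        d = np.linalg.norm(X[idx] - X[i], axis=1)
        nbr = idx[np.argsort(d)[:n_neighbors]]
        # Solve for reconstruction weights from the regularized local Gram matrix.
        Z = X[nbr] - X[i]
        G = Z @ Z.T + reg * np.eye(len(nbr))
        w = np.linalg.solve(G, np.ones(len(nbr)))
        W[i, nbr] = w / w.sum()
    # Low-dimensional coordinates: bottom eigenvectors of (I - W)^T (I - W),
    # skipping the constant eigenvector.
    M = (np.eye(n) - W).T @ (np.eye(n) - W)
    _, vecs = np.linalg.eigh(M)
    return vecs[:, 1:1 + out_dim]
```

The point of the intra-class restriction is that reconstruction weights are learned only from neighbors sharing a label, so the local geometry that LLE preserves is the geometry of each class, which can then guide classification without updating any model parameters.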




How to Cite

Wang, H., Zhao, S., Liu, C., Xi, N., Cai, M., Qin, B., & Liu, T. (2024). Manifold-Based Verbalizer Space Re-embedding for Tuning-Free Prompt-Based Classification. Proceedings of the AAAI Conference on Artificial Intelligence, 38(17), 19126-19134.



AAAI Technical Track on Natural Language Processing II