Next Generation Active Learning: Mixture of LLMs in the Loop

Authors

  • Yuanyuan Qi Monash University
  • Xiaohao Yang Monash University
  • Jueqing Lu Monash University
  • Guoxiang Guo Monash University
  • Joanne Enticott Monash University
  • Gang Liu Harbin Engineering University
  • Lan Du Monash University

DOI:

https://doi.org/10.1609/aaai.v40i29.39678

Abstract

With the rapid advancement and strong generalization capabilities of large language models (LLMs), they have been increasingly incorporated into active learning pipelines as annotators to reduce annotation costs. However, the quality of LLM-generated labels often falls short of what real-world applications require. To address this, we propose a novel active learning framework, Mixture of LLMs in the Loop Active Learning, which replaces human annotators with a Mixture-of-LLMs-based annotation model, aiming to enhance the robustness of LLM-based annotation by aggregating the strengths of multiple LLMs. To further mitigate the impact of noisy labels, we introduce annotation discrepancy and negative learning to identify unreliable annotations and enhance learning effectiveness. Extensive experiments demonstrate that our framework achieves performance comparable to human annotation and consistently outperforms single-LLM baselines and other LLM-ensemble-based approaches. Moreover, our framework is built on lightweight LLMs, enabling it to operate fully on local machines in real-world applications.
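To illustrate the two ideas the abstract names, here is a minimal, hypothetical sketch (not the authors' implementation, whose details are in the paper): multiple LLM annotators are aggregated by majority vote, a per-example discrepancy score flags unreliable annotations where the annotators disagree, and a negative-learning loss trains against a complementary label (a class the example is presumed *not* to belong to) instead of the possibly noisy positive label. All function names and the threshold below are illustrative assumptions.

```python
import math
from collections import Counter

def aggregate_annotations(annotations):
    """Aggregate labels from several LLM annotators by majority vote.

    Returns (label, discrepancy), where discrepancy is the fraction of
    annotators that disagree with the majority; a high value marks the
    annotation as unreliable.
    """
    counts = Counter(annotations)
    label, votes = counts.most_common(1)[0]
    discrepancy = 1.0 - votes / len(annotations)
    return label, discrepancy

def negative_learning_loss(probs, complementary_label):
    """Negative-learning loss -log(1 - p_c) for a complementary label c.

    Instead of pushing probability toward a possibly noisy label, it
    pushes probability away from a class the example is presumed not
    to have, which is more robust under label noise.
    """
    return -math.log(1.0 - probs[complementary_label])

# Illustrative use: three LLM annotators label one example.
label, disc = aggregate_annotations(["pos", "pos", "neg"])
# A hypothetical threshold decides whether to trust the label directly
# or fall back to negative learning on a disagreeing class.
RELIABLE_THRESHOLD = 0.25  # assumption, not from the paper
```

In this sketch, examples with `disc <= RELIABLE_THRESHOLD` would be trained with the usual cross-entropy on the majority label, while high-discrepancy examples would instead contribute only the negative-learning term.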

Published

2026-03-14

How to Cite

Qi, Y., Yang, X., Lu, J., Guo, G., Enticott, J., Liu, G., & Du, L. (2026). Next Generation Active Learning: Mixture of LLMs in the Loop. Proceedings of the AAAI Conference on Artificial Intelligence, 40(29), 24909-24917. https://doi.org/10.1609/aaai.v40i29.39678

Section

AAAI Technical Track on Machine Learning VI