Learning Systems Expansion with Efficient Heterogeneity-aware Knowledge Transfer

Authors

  • Gaole Dai, Nanyang Technological University
  • Huatao Xu, Hong Kong University of Science and Technology
  • Yifan Yang, Microsoft Research Asia
  • Rui Tan, Nanyang Technological University
  • Mo Li, Hong Kong University of Science and Technology

DOI:

https://doi.org/10.1609/aaai.v40i25.39204

Abstract

Modern AI services must continually adapt to newly joined domains, yet delivering high-quality customized models is hampered by label sparsity, domain shifts, and tight budgets. We formulate this challenge as the learning system expansion problem and introduce HaT, an efficient heterogeneity-aware knowledge-transfer framework. HaT first selects a small set of high-quality source models with minimal overhead, then fuses their imperfect predictions through a sample-wise attention mixer, and finally distills the fused knowledge adaptively into target models via a knowledge dictionary. Extensive experiments across different tasks and modalities show that HaT outperforms state-of-the-art baselines by up to 16.5% in accuracy while reducing training time by 31.1% and network traffic by up to 93.0%.
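The fusion-and-distillation steps described in the abstract can be sketched in broad strokes. The snippet below is an illustrative assumption only, not the paper's actual mechanism: it weights each source model's per-sample prediction by its confidence (negative entropy), softmax-normalizes the weights across sources as a stand-in for the sample-wise attention mixer, and uses the fused soft labels as distillation targets. All function names and the weighting scheme are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    z = x - x.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def fuse_predictions(source_probs):
    """Sample-wise attention fusion (illustrative sketch).

    source_probs: (K, N, C) class probabilities from K source models
    over N samples. Each source is weighted per sample by its
    confidence (negative entropy), normalized across sources.
    Returns fused (N, C) soft labels.
    """
    eps = 1e-12
    entropy = -(source_probs * np.log(source_probs + eps)).sum(axis=-1)  # (K, N)
    attn = softmax(-entropy, axis=0)  # (K, N): confident sources get more weight
    return (attn[..., None] * source_probs).sum(axis=0)  # (N, C)

def distill_loss(student_logits, fused_targets, T=2.0):
    """Soft cross-entropy of the student against the fused targets,
    with temperature T softening the student distribution."""
    log_p = np.log(softmax(student_logits / T) + 1e-12)
    return float(-(fused_targets * log_p).sum(axis=-1).mean())
```

In a real heterogeneity-aware setting the attention weights would presumably be learned rather than entropy-based, and the knowledge-dictionary step would mediate between mismatched source and target label spaces; neither is modeled here.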

Published

2026-03-14

How to Cite

Dai, G., Xu, H., Yang, Y., Tan, R., & Li, M. (2026). Learning Systems Expansion with Efficient Heterogeneity-aware Knowledge Transfer. Proceedings of the AAAI Conference on Artificial Intelligence, 40(25), 20667–20675. https://doi.org/10.1609/aaai.v40i25.39204

Section

AAAI Technical Track on Machine Learning II