Knowledge Transfer via Compact Model in Federated Learning (Student Abstract)

Authors

  • Jiaming Pei, University of Sydney
  • Wei Li, University of Sydney
  • Lukun Wang, Shandong University of Science and Technology

DOI:

https://doi.org/10.1609/aaai.v38i21.30498

Keywords:

Federated Learning, Knowledge Transfer, Communication Efficiency

Abstract

Communication overhead remains a significant challenge in federated learning due to frequent global model updates. Essentially, each update of the global model can be viewed as a knowledge transfer. We aim to transfer more knowledge through a compact model while reducing communication overhead. In our study, we introduce a federated learning framework in which clients pre-train large models locally and the server initializes a compact model for communication. This compact model should be small in size yet carry enough knowledge to refine the global model effectively. We facilitate the knowledge transfer from local to global models based on the pre-training outcomes. Our experiments show that our approach significantly reduces communication overhead without sacrificing accuracy.
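The abstract does not specify how knowledge is moved from the pre-trained local models into the compact communicated model. The sketch below is one plausible, minimal realization assuming distillation-based transfer and FedAvg-style aggregation; the function names (distill_to_compact, federated_round) and hyperparameters are illustrative, not the authors' actual method.

```python
import copy
import torch
import torch.nn.functional as F


def distill_to_compact(large_model, compact_model, loader, epochs=1, T=2.0, lr=1e-3):
    """Transfer knowledge from a locally pre-trained large model into a
    compact model via soft-label distillation (an assumed mechanism)."""
    opt = torch.optim.Adam(compact_model.parameters(), lr=lr)
    large_model.eval()
    for _ in range(epochs):
        for x, _ in loader:
            with torch.no_grad():
                teacher_logits = large_model(x)
            student_logits = compact_model(x)
            # KL divergence between softened teacher and student outputs.
            loss = F.kl_div(
                F.log_softmax(student_logits / T, dim=1),
                F.softmax(teacher_logits / T, dim=1),
                reduction="batchmean",
            ) * (T * T)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return compact_model


def federated_round(global_compact, client_large_models, client_loaders):
    """One communication round: each client distills its pre-trained large
    model into a copy of the server's compact model, and the server
    averages the returned compact weights (FedAvg-style aggregation)."""
    client_states = []
    for large_model, loader in zip(client_large_models, client_loaders):
        local_compact = copy.deepcopy(global_compact)
        distill_to_compact(large_model, local_compact, loader)
        client_states.append(local_compact.state_dict())
    # Only the compact model's parameters cross the network, which is
    # where the communication savings would come from in this sketch.
    avg_state = {
        k: torch.stack([s[k].float() for s in client_states]).mean(dim=0)
        for k in client_states[0]
    }
    global_compact.load_state_dict(avg_state)
    return global_compact
```

In this reading, the per-round payload is the compact model's state dict rather than the large local models, so the communication cost scales with the compact architecture while the locally pre-trained large models never leave the clients.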

Published

2024-03-24

How to Cite

Pei, J., Li, W., & Wang, L. (2024). Knowledge Transfer via Compact Model in Federated Learning (Student Abstract). Proceedings of the AAAI Conference on Artificial Intelligence, 38(21), 23621-23622. https://doi.org/10.1609/aaai.v38i21.30498