Knowledge Transfer via Compact Model in Federated Learning (Student Abstract)
DOI:
https://doi.org/10.1609/aaai.v38i21.30498
Keywords:
Federated Learning, Knowledge Transfer, Communication Efficiency
Abstract
Communication overhead remains a significant challenge in federated learning due to frequent global model updates. Essentially, each update of the global model can be viewed as a form of knowledge transfer. We aim to transfer more knowledge through a compact model while reducing communication overhead. In our study, we introduce a federated learning framework in which clients pre-train large models locally and the server initializes a compact model for communication. This compact model should be small in size yet retain enough knowledge to refine the global model effectively. We facilitate the knowledge transfer from local to global models based on the pre-training outcomes. Our experiments show that our approach significantly reduces communication overhead without sacrificing accuracy.
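The abstract does not specify how knowledge moves from the locally pre-trained large models into the compact model, nor which architectures or aggregation rule are used. The following minimal sketch is therefore only one plausible reading: it assumes soft-label knowledge distillation on each client and FedAvg-style averaging of the compact model on the server. All names (LargeLocalModel, CompactModel, distill_to_compact, fedavg) and hyperparameters are hypothetical illustrations, not the authors' implementation.

```python
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical architectures; the paper does not specify model sizes.
class LargeLocalModel(nn.Module):
    def __init__(self, in_dim=32, hidden=256, num_classes=10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, num_classes),
        )

    def forward(self, x):
        return self.net(x)

class CompactModel(nn.Module):
    def __init__(self, in_dim=32, hidden=32, num_classes=10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, num_classes),
        )

    def forward(self, x):
        return self.net(x)

def distill_to_compact(teacher, student, data, epochs=1, temperature=2.0, lr=1e-3):
    """Transfer knowledge from a locally pre-trained large model (teacher)
    into the compact model (student) via soft-label distillation (assumed mechanism)."""
    opt = torch.optim.Adam(student.parameters(), lr=lr)
    teacher.eval()
    for _ in range(epochs):
        for x in data:
            with torch.no_grad():
                soft = F.softmax(teacher(x) / temperature, dim=-1)
            loss = F.kl_div(F.log_softmax(student(x) / temperature, dim=-1),
                            soft, reduction="batchmean")
            opt.zero_grad()
            loss.backward()
            opt.step()
    return student

def fedavg(states):
    """Server-side averaging of the compact model weights returned by clients."""
    avg = copy.deepcopy(states[0])
    for key in avg:
        avg[key] = torch.stack([s[key].float() for s in states]).mean(dim=0)
    return avg

# One communication round over synthetic placeholder data.
num_clients, global_compact = 3, CompactModel()
client_states = []
for _ in range(num_clients):
    local_data = [torch.randn(16, 32) for _ in range(10)]   # stands in for private client data
    teacher = LargeLocalModel()                              # stands in for local pre-training
    student = copy.deepcopy(global_compact)                  # start from the server's compact model
    client_states.append(distill_to_compact(teacher, student, local_data).state_dict())
global_compact.load_state_dict(fedavg(client_states))        # only compact weights are communicated
```

Under these assumptions, the communication saving comes from uploading only the compact model's parameters each round while the large models never leave the clients.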
Published
2024-03-24
How to Cite
Pei, J., Li, W., & Wang, L. (2024). Knowledge Transfer via Compact Model in Federated Learning (Student Abstract). Proceedings of the AAAI Conference on Artificial Intelligence, 38(21), 23621-23622. https://doi.org/10.1609/aaai.v38i21.30498
Issue
Section
AAAI Student Abstract and Poster Program