AsyncFL: Asynchronous Federated Learning Using Majority Voting with Quantized Model Updates (Student Abstract)

Authors

  • Suji Jang Gwangju Institute of Science and Technology (GIST)
  • Hyuk Lim Gwangju Institute of Science and Technology (GIST)

DOI:

https://doi.org/10.1609/aaai.v36i11.21624

Keywords:

Asynchronous Federated Learning, Quantization, Majority Voting

Abstract

Federated learning (FL) typically updates the global model in a synchronous manner: the FL server waits for a specific number of local models from distributed devices before computing and sharing a new global model. We propose asynchronous federated learning (AsyncFL), which allows each client to continuously upload its model according to its capabilities, and the FL server to decide when to asynchronously update and broadcast the global model. The asynchronous model aggregation at the FL server is performed by the Boyer–Moore majority voting algorithm over the k-bit quantized weight values. The proposed scheme can speed up the convergence of global model learning early in the FL process and reduce data exchange once the model has converged.
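The aggregation idea described above can be illustrated with a small sketch: each client weight is mapped to one of 2^k quantization levels, and the server picks, per parameter, the level that wins a Boyer–Moore majority vote across client updates. The uniform quantizer, the weight range, and all function names below are illustrative assumptions, not the authors' exact implementation.

```python
def quantize(w, k, w_min=-1.0, w_max=1.0):
    """Uniformly quantize a float weight into one of 2**k levels
    over [w_min, w_max] (an assumed quantizer, for illustration)."""
    levels = 2 ** k
    step = (w_max - w_min) / (levels - 1)
    idx = round((min(max(w, w_min), w_max) - w_min) / step)
    return w_min + idx * step

def boyer_moore_majority(values):
    """Single-pass Boyer-Moore majority vote: O(n) time, O(1) space.
    Returns the majority element if one occurs in more than half the
    entries; otherwise the returned candidate is not guaranteed to
    be a true majority."""
    candidate, count = None, 0
    for v in values:
        if count == 0:
            candidate, count = v, 1
        elif v == candidate:
            count += 1
        else:
            count -= 1
    return candidate

# Aggregate one scalar weight position across asynchronous client
# updates: quantize each client's value, then majority-vote.
client_weights = [0.52, 0.49, 0.51, -0.2, 0.50]
quantized = [quantize(w, k=4) for w in client_weights]
global_w = boyer_moore_majority(quantized)
```

In a full model, this vote would run independently for every weight position, so the server can fold in client updates as they arrive and emit a new global model whenever it chooses.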

Published

2022-06-28

How to Cite

Jang, S., & Lim, H. (2022). AsyncFL: Asynchronous Federated Learning Using Majority Voting with Quantized Model Updates (Student Abstract). Proceedings of the AAAI Conference on Artificial Intelligence, 36(11), 12975-12976. https://doi.org/10.1609/aaai.v36i11.21624