FedSoft: Soft Clustered Federated Learning with Proximal Local Updating

Authors

  • Yichen Ruan, Carnegie Mellon University
  • Carlee Joe-Wong, Carnegie Mellon University

DOI:

https://doi.org/10.1609/aaai.v36i7.20785

Keywords:

Machine Learning (ML)

Abstract

Traditionally, clustered federated learning groups clients with the same data distribution into a cluster, so that every client is uniquely associated with one data distribution and helps train a model for this distribution. We relax this hard-association assumption to soft clustered federated learning, which allows every local dataset to follow a mixture of multiple source distributions. We propose FedSoft, which trains both locally personalized models and high-quality cluster models in this setting. FedSoft limits client workload by using proximal updates, requiring only a subset of clients to complete a single optimization task in every communication round. We show, analytically and empirically, that FedSoft effectively exploits similarities between the source distributions to learn personalized and cluster models that perform well.
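The abstract only summarizes the method, so the sketch below is a rough illustration of the proximal local updating it describes, not the paper's actual implementation. All names here (`estimate_importance_weights`, `proximal_local_update`, the proximal coefficient `lam`, the step counts) are illustrative assumptions: each client estimates how strongly its data matches each cluster, then solves one optimization task whose proximal term pulls its personalized model toward the cluster models, weighted by those estimates.

```python
import numpy as np

def estimate_importance_weights(local_losses):
    """Hypothetical estimator: weight each cluster by how well its model
    fits this client's data (lower local loss -> larger weight)."""
    inv = 1.0 / (np.asarray(local_losses) + 1e-8)
    return inv / inv.sum()

def proximal_local_update(w, cluster_models, u, grad_f, lam=0.1, lr=0.01, steps=10):
    """One client's single per-round optimization task (sketch).

    Runs gradient descent on
        f_k(w) + (lam / 2) * sum_s u[s] * ||w - w_s||^2,
    so the personalized model w stays close to the cluster models the
    client's data resembles most.
    """
    for _ in range(steps):
        prox_grad = lam * sum(u_s * (w - w_s) for u_s, w_s in zip(u, cluster_models))
        w = w - lr * (grad_f(w) + prox_grad)
    return w

# Toy usage: quadratic local loss f_k(w) = ||w - w_star||^2 / 2.
w_star = np.array([1.0, -2.0])
grad_f = lambda w: w - w_star
clusters = [np.zeros(2), np.array([2.0, -3.0])]
u = estimate_importance_weights([1.5, 0.4])   # this client's data favors cluster 2
w = proximal_local_update(np.zeros(2), clusters, u, grad_f)
```

The paper specifies the importance-weight estimator and the aggregation of client updates into cluster models precisely; this sketch only conveys the shape of the per-round client computation under soft clustering.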

Published

2022-06-28

How to Cite

Ruan, Y., & Joe-Wong, C. (2022). FedSoft: Soft Clustered Federated Learning with Proximal Local Updating. Proceedings of the AAAI Conference on Artificial Intelligence, 36(7), 8124-8131. https://doi.org/10.1609/aaai.v36i7.20785

Issue

Vol. 36 No. 7 (2022)

Section

AAAI Technical Track on Machine Learning II