Tackling Data Heterogeneity in Federated Learning with Class Prototypes
Keywords: ML: Distributed Machine Learning & Federated Learning; CV: Representation Learning for Vision; ML: Classification and Regression
Abstract

Data heterogeneity across clients in federated learning (FL) settings is a widely acknowledged challenge. In response, personalized federated learning (PFL) emerged as a framework to curate local models for clients' tasks. In PFL, a common strategy is to develop local and global models jointly: the global model (for generalization) informs the local models, and the local models (for personalization) are aggregated to update the global model. A key observation is that if we can improve the generalization ability of local models, then we can improve the generalization of global models, which in turn builds better personalized models. In this work, we consider class imbalance, an overlooked type of data heterogeneity, in the classification setting. We propose FedNH, a novel method that improves the local models' performance for both personalization and generalization by combining the uniformity and semantics of class prototypes. FedNH initially distributes class prototypes uniformly in the latent space and smoothly infuses the class semantics into class prototypes. We show that imposing uniformity helps to combat prototype collapse, while infusing class semantics improves local models. Extensive experiments were conducted on popular classification datasets under the cross-device setting. Our results demonstrate the effectiveness and stability of our method over recent works.
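The two ideas named in the abstract, initializing class prototypes to be uniformly spread in the latent space and then smoothly infusing class semantics into them, can be sketched as below. This is a minimal illustration under assumptions, not the authors' implementation: the uniformity objective (minimizing squared pairwise cosine similarities on the unit sphere) and the smoothing coefficient `rho` are stand-ins chosen for this sketch.

```python
import numpy as np

def uniform_prototypes(num_classes, dim, steps=500, lr=0.1, seed=0):
    """Spread class prototypes over the unit hypersphere by gradient
    descent on the sum of squared pairwise cosine similarities
    (an assumed uniformity proxy; lower means more spread out)."""
    rng = np.random.default_rng(seed)
    P = rng.normal(size=(num_classes, dim))
    P /= np.linalg.norm(P, axis=1, keepdims=True)
    for _ in range(steps):
        sim = P @ P.T                       # pairwise cosine similarities
        np.fill_diagonal(sim, 0.0)          # ignore self-similarity
        grad = sim @ P                      # descent direction for the proxy loss
        P = P - lr * grad
        P /= np.linalg.norm(P, axis=1, keepdims=True)  # project back onto sphere
    return P

def smooth_prototype_update(prototypes, class_means, rho=0.9):
    """Infuse class semantics: blend current prototypes with aggregated
    class-mean features, then re-project onto the unit sphere."""
    blended = rho * prototypes + (1.0 - rho) * class_means
    return blended / np.linalg.norm(blended, axis=1, keepdims=True)
```

In a server round, `smooth_prototype_update` would receive class-mean embeddings aggregated from clients; a `rho` near 1 keeps prototypes stable (preserving uniformity) while gradually absorbing class semantics.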
How to Cite
Dai, Y., Chen, Z., Li, J., Heinecke, S., Sun, L., & Xu, R. (2023). Tackling Data Heterogeneity in Federated Learning with Class Prototypes. Proceedings of the AAAI Conference on Artificial Intelligence, 37(6), 7314-7322. https://doi.org/10.1609/aaai.v37i6.25891
AAAI Technical Track on Machine Learning I