Addressing Class Imbalance in Federated Learning


  • Lixu Wang Northwestern University
  • Shichao Xu Northwestern University
  • Xiao Wang Northwestern University
  • Qi Zhu Northwestern University


Distributed Machine Learning & Federated Learning, Multi-class/Multi-label Learning & Extreme Classification


Federated learning (FL) is a promising approach for training models on decentralized data located on local client devices while improving efficiency and privacy. However, the distribution and quantity of the training data on the clients' side may lead to significant challenges such as class imbalance and non-IID (non-independent and identically distributed) data, which could greatly impact the performance of the common model. While much effort has been devoted to helping FL models converge when encountering non-IID data, the imbalance issue has not been sufficiently addressed. In particular, as FL training is executed by exchanging gradients in an encrypted form, the training data is not completely observable to either clients or server, and previous methods for class imbalance do not perform well in FL. Therefore, it is crucial to design new methods for detecting class imbalance in FL and mitigating its impact. In this work, we propose a monitoring scheme that can infer the composition of training data in each FL round, and design a new loss function -- Ratio Loss -- to mitigate the impact of the imbalance. Our experiments demonstrate the importance of acknowledging class imbalance and taking mitigation measures as early as possible in FL training, as well as the effectiveness of our method in mitigating the impact. Our method significantly outperforms previous methods while maintaining client privacy.
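To make the general idea concrete, here is a minimal sketch of an imbalance-aware loss in the spirit described above: once the per-class composition of a round's training data has been estimated, the per-sample loss is reweighted inversely to estimated class frequency. This is an illustrative assumption, not the paper's actual Ratio Loss formulation; the function name and weighting scheme are hypothetical.

```python
import numpy as np

def inverse_frequency_cross_entropy(logits, labels, class_counts):
    """Cross-entropy reweighted by inverse estimated class frequency.

    Illustrative sketch only -- NOT the paper's Ratio Loss. The idea:
    given an estimate of the round's per-class sample counts
    (class_counts), under-represented classes receive larger loss
    weights, so the minority classes are not drowned out.
    """
    counts = np.asarray(class_counts, dtype=float)
    # Inverse-frequency weights, normalized so a balanced dataset
    # yields weight 1.0 for every class.
    weights = counts.sum() / (len(counts) * counts)
    # Numerically stable softmax over the class dimension.
    z = logits - logits.max(axis=1, keepdims=True)
    probs = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
    # Per-sample negative log-likelihood of the true class.
    nll = -np.log(probs[np.arange(len(labels)), labels] + 1e-12)
    # Weight each sample by its class weight and average.
    return float(np.mean(weights[labels] * nll))
```

With balanced counts this reduces to ordinary cross-entropy; with skewed counts, errors on minority-class samples are penalized more heavily.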




How to Cite

Wang, L., Xu, S., Wang, X., & Zhu, Q. (2021). Addressing Class Imbalance in Federated Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 35(11), 10165-10173.



AAAI Technical Track on Machine Learning IV