Bayesian Federated Neural Matching That Completes Full Information

Authors

  • Peng Xiao, Tongji University
  • Samuel Cheng, University of Oklahoma

DOI:

https://doi.org/10.1609/aaai.v37i9.26245

Keywords:

ML: Bayesian Learning, ML: Distributed Machine Learning & Federated Learning

Abstract

Federated learning is a contemporary machine learning paradigm in which locally trained models are distilled into a global model. Because neural networks are intrinsically permutation invariant, Probabilistic Federated Neural Matching (PFNM) employs a Bayesian nonparametric framework to model the generation process of local neurons, and then solves a linear sum assignment problem in each alternating optimization iteration. However, our theoretical analysis shows that this optimization iteration in PFNM omits the existing global information. In this study, we propose a novel approach that overcomes this flaw by introducing a Kullback-Leibler divergence penalty at each iteration. The effectiveness of our approach is demonstrated by experiments on both image classification and semantic segmentation tasks.
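The abstract describes two moving parts: a per-iteration linear sum assignment between local and global neurons, and a KL divergence penalty that retains the existing global information. The sketch below is a minimal illustration of that combination, not the paper's actual method: the squared-distance cost, the softmax normalization, the kl_weight parameter, and the match_neurons helper are all assumptions introduced here for illustration, whereas the paper derives its penalty from the Bayesian posterior of the matching.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.special import rel_entr  # elementwise x * log(x / y)


def match_neurons(local_w, global_w, kl_weight=1.0):
    """Match local neurons to global neurons via linear sum assignment.

    local_w:   (J, D) weight vectors of J local neurons
    global_w:  (L, D) weight vectors of L existing global neurons
    kl_weight: strength of a hypothetical KL penalty that keeps the
               assignment close to the existing global neurons, so
               the iteration does not discard global information.
    """
    # Squared-distance matching cost, a common choice for Gaussian
    # neuron models in PFNM-style matching (an assumption here).
    cost = ((local_w[:, None, :] - global_w[None, :, :]) ** 2).sum(-1)

    def softmax(x):
        e = np.exp(x - x.max(axis=-1, keepdims=True))
        return e / e.sum(axis=-1, keepdims=True)

    # Hypothetical penalty: treat each neuron's softmax-normalized
    # weights as a distribution and add KL(local || global) to the cost.
    p = softmax(local_w)   # (J, D)
    q = softmax(global_w)  # (L, D)
    kl = np.array([[rel_entr(pi, qj).sum() for qj in q] for pi in p])

    # Hungarian algorithm on the penalized cost matrix.
    row, col = linear_sum_assignment(cost + kl_weight * kl)
    return row, col  # local neuron row[i] is matched to global neuron col[i]


# Example: match 10 local neurons against 12 existing global neurons.
rng = np.random.default_rng(0)
rows, cols = match_neurons(rng.normal(size=(10, 32)),
                           rng.normal(size=(12, 32)),
                           kl_weight=0.5)
```

Setting kl_weight to zero in this sketch recovers a plain assignment on the matching cost alone, which mirrors the flaw the abstract attributes to PFNM: nothing then ties the update to the existing global state.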

Published

2023-06-26

How to Cite

Xiao, P., & Cheng, S. (2023). Bayesian Federated Neural Matching That Completes Full Information. Proceedings of the AAAI Conference on Artificial Intelligence, 37(9), 10473-10480. https://doi.org/10.1609/aaai.v37i9.26245

Issue

Vol. 37 No. 9 (2023)

Section

AAAI Technical Track on Machine Learning IV