FedPKDA: Personalized Federated Learning with Privacy-Preserving Knowledge Dynamic Alignment

Authors

  • Moxuan Zeng School of Computer Science and Technology, Hainan University, Haikou, China
  • Wenxuan Tu School of Computer Science and Technology, Hainan University, Haikou, China Hainan Blockchain Technology Engineering Research Center, Haikou, China
  • Yuanyi Chen School of Computer Science and Technology, Hainan University, Haikou, China
  • Yiying Wang School of Computer Science and Technology, Hainan University, Haikou, China
  • Miao Yu Hainan Blockchain Technology Engineering Research Center, Haikou, China
  • Xiangyan Tang School of Computer Science and Technology, Hainan University, Haikou, China Hainan Blockchain Technology Engineering Research Center, Haikou, China
  • Jieren Cheng School of Computer Science and Technology, Hainan University, Haikou, China Hainan Blockchain Technology Engineering Research Center, Haikou, China

DOI:

https://doi.org/10.1609/aaai.v40i33.40037

Abstract

Personalized Federated Learning (PFL), which aims to customize models for each client while preserving data privacy, has become an important research topic in addressing the challenges of data heterogeneity. Existing studies usually enhance the localization of global parameters by injecting local information into the globally shared model. However, these methods focus excessively on the personalized characteristics of individual clients and fail to fully exploit distinctive information across clients, limiting the ability of local models to represent unseen samples well. To address this issue, we propose a novel personalized Federated Privacy-preserving Knowledge Dynamic Alignment (FedPKDA) framework, which ensures data privacy during both the collection of client-side key information and its incorporation into federated model training. Specifically, to ensure data privacy during the cross-client information collection phase, we first apply feature clipping and add Laplacian noise to the local prototypes extracted from each client. We then compute the centroid of the uploaded local prototypes in a latent space and leverage the Mahalanobis distance to guide the generation of global prototypes, thereby preserving the semantic contributions of participating clients. Moreover, to boost the personalization of local models, we dynamically align the representations learned by the shared model with both a set of local prototypes and the privacy-preserving global prototypes, facilitating effective cross-client knowledge sharing under heterogeneous settings while preserving client-specific characteristics. Extensive experiments on benchmark datasets verify the superiority of FedPKDA over its competitors.
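The abstract describes three mechanisms: clipping and noising local prototypes, Mahalanobis-guided global prototype aggregation, and aligning representations with local and global prototypes. The sketch below illustrates one plausible reading of these steps with NumPy; the paper's exact formulation is not given here, so the clipping bound, noise scale, distance-to-weight mapping, and loss form are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def privatize_prototype(proto, clip_norm=1.0, epsilon=1.0, seed=None):
    """Clip the prototype's L2 norm, then add Laplacian noise (assumed mechanism)."""
    rng = np.random.default_rng(seed)
    norm = np.linalg.norm(proto)
    clipped = proto * min(1.0, clip_norm / max(norm, 1e-12))
    scale = clip_norm / epsilon  # illustrative noise scale, not the paper's calibration
    return clipped + rng.laplace(0.0, scale, size=proto.shape)

def aggregate_global_prototype(protos):
    """Weight uploaded prototypes by Mahalanobis distance to their centroid (assumed scheme)."""
    P = np.stack(protos)                                        # (n_clients, d)
    mu = P.mean(axis=0)                                         # centroid in latent space
    cov = np.cov(P, rowvar=False) + 1e-3 * np.eye(P.shape[1])   # regularized covariance
    cov_inv = np.linalg.inv(cov)
    diffs = P - mu
    dist = np.sqrt(np.einsum("ij,jk,ik->i", diffs, cov_inv, diffs))  # Mahalanobis distances
    w = np.exp(-dist)
    w /= w.sum()                                                # normalize to a simplex
    return w @ P, w

def alignment_loss(z, local_proto, global_proto, lam=0.5):
    """Pull a representation toward its local prototype and the shared global one."""
    return float(np.sum((z - local_proto) ** 2) + lam * np.sum((z - global_proto) ** 2))
```

A client would call `privatize_prototype` before uploading; the server runs `aggregate_global_prototype` over the noisy uploads, and `alignment_loss` stands in for the dynamic alignment term added to local training.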

Published

2026-03-14

How to Cite

Zeng, M., Tu, W., Chen, Y., Wang, Y., Yu, M., Tang, X., & Cheng, J. (2026). FedPKDA: Personalized Federated Learning with Privacy-Preserving Knowledge Dynamic Alignment. Proceedings of the AAAI Conference on Artificial Intelligence, 40(33), 28113–28121. https://doi.org/10.1609/aaai.v40i33.40037

Section

AAAI Technical Track on Machine Learning X