Personalized Federated Learning with Bidirectional Communication Compression via One-Bit Random Sketching

Authors

  • Jiacheng Cheng, School of Automation, Northwestern Polytechnical University, Xi'an, China
  • Xu Zhang, School of Artificial Intelligence, Xidian University, Xi'an, China
  • Guanghui Qiu, Science and Technology on Electronic Information Control Laboratory, China Academy of Electronics and Information Technology, Chengdu, China
  • Yifang Zhang, School of Artificial Intelligence, Xidian University, Xi'an, China; Science and Technology on Electronic Information Control Laboratory, China Academy of Electronics and Information Technology, Chengdu, China
  • Yinchuan Li, Knowin AI, Shenzhen, China
  • Kaiyuan Feng, Academy of Advanced Interdisciplinary Research, Xidian University, Xi'an, China

DOI:

https://doi.org/10.1609/aaai.v40i25.39185

Abstract

Federated Learning (FL) enables collaborative training across decentralized data, but it faces two key challenges: bidirectional communication overhead and client-side data heterogeneity. In personalized FL, the goal shifts from training a single global model to creating tailored models for each client. To address communication costs while embracing data heterogeneity, we propose pFed1BS, a novel personalized federated learning framework that achieves extreme communication compression through one-bit random sketching. In our framework, clients transmit highly compressed one-bit sketches, and the server aggregates them and broadcasts a global one-bit consensus. To enable effective personalization, we introduce a sign-based regularizer that guides local models to align with the global consensus while preserving local data characteristics. To mitigate the computational burden of random sketching, we employ the Fast Hadamard Transform for efficient projection. Theoretical analysis guarantees that our algorithm converges to a stationary neighborhood of the global potential function. Numerical simulations demonstrate that pFed1BS substantially reduces communication costs while achieving competitive performance compared to advanced communication-efficient FL algorithms.
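To make the compression idea concrete, below is a minimal illustrative sketch of one-bit random sketching with a fast Walsh–Hadamard transform, in the style the abstract describes. This is not the authors' implementation: the SRHT-style projection (random sign flip, Hadamard transform, coordinate subsampling), the function names, and the dimensions are all assumptions chosen for illustration.

```python
import numpy as np

def fwht(x):
    """Fast Walsh-Hadamard transform (length must be a power of two)."""
    x = x.copy()
    n, h = len(x), 1
    while h < n:
        x = x.reshape(-1, 2, h)
        a, b = x[:, 0, :].copy(), x[:, 1, :].copy()
        x[:, 0, :] = a + b          # butterfly: sum
        x[:, 1, :] = a - b          # butterfly: difference
        x = x.reshape(n)
        h *= 2
    return x

def one_bit_sketch(w, rademacher, idx):
    """One-bit sketch of a model vector w:
    random sign flip -> Hadamard transform -> subsample -> keep only signs."""
    z = fwht(rademacher * w)        # structured random projection (SRHT-style)
    return np.sign(z[idx])          # m coordinates, 1 bit each

rng = np.random.default_rng(0)
d, m = 8, 4                          # model dimension, sketch dimension (toy sizes)
rademacher = rng.choice([-1.0, 1.0], size=d)       # shared random diagonal
idx = rng.choice(d, size=m, replace=False)         # shared subsampling pattern

# each client uploads only m bits instead of d floats
sketches = [one_bit_sketch(rng.standard_normal(d), rademacher, idx)
            for _ in range(3)]
# server aggregates and broadcasts a one-bit consensus (majority vote of signs)
consensus = np.sign(np.sum(sketches, axis=0))
```

The shared seed for the Rademacher diagonal and subsampling pattern lets clients and server agree on the projection without transmitting it; the FWHT reduces the projection cost from O(dm) for a dense Gaussian matrix to O(d log d).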

Published

2026-03-14

How to Cite

Cheng, J., Zhang, X., Qiu, G., Zhang, Y., Li, Y., & Feng, K. (2026). Personalized Federated Learning with Bidirectional Communication Compression via One-Bit Random Sketching. Proceedings of the AAAI Conference on Artificial Intelligence, 40(25), 20499–20508. https://doi.org/10.1609/aaai.v40i25.39185

Section

AAAI Technical Track on Machine Learning II