z-SignFedAvg: A Unified Stochastic Sign-Based Compression for Federated Learning

Authors

  • Zhiwei Tang, School of Science and Engineering, The Chinese University of Hong Kong, Shenzhen, China; Shenzhen Research Institute of Big Data, Shenzhen, China
  • Yanmeng Wang, School of Science and Engineering, The Chinese University of Hong Kong, Shenzhen, China; Shenzhen Research Institute of Big Data, Shenzhen, China
  • Tsung-Hui Chang, School of Science and Engineering, The Chinese University of Hong Kong, Shenzhen, China; Shenzhen Research Institute of Big Data, Shenzhen, China

DOI:

https://doi.org/10.1609/aaai.v38i14.29454

Keywords:

ML: Distributed Machine Learning & Federated Learning, SO: Non-convex Optimization

Abstract

Federated Learning (FL) is a promising privacy-preserving distributed learning paradigm, but it suffers from high communication cost when training large-scale machine learning models. Sign-based methods, such as SignSGD, have been proposed as a biased gradient compression technique for reducing the communication cost. However, sign-based algorithms can diverge under heterogeneous data, which has motivated the development of advanced techniques, such as the error-feedback method and stochastic sign-based compression, to fix this issue. Nevertheless, these methods still suffer from slower convergence rates, and none of them allows multiple local SGD updates like FedAvg. In this paper, we propose a novel noisy perturbation scheme with a general symmetric noise distribution for sign-based compression, which not only allows one to flexibly control the bias-variance tradeoff of the compressed gradient, but also provides a unified viewpoint on existing stochastic sign-based methods. More importantly, the proposed scheme enables the development of the very first sign-based FedAvg algorithm (z-SignFedAvg) to accelerate convergence. Theoretically, we show that z-SignFedAvg achieves a faster convergence rate than existing sign-based methods and, under uniformly distributed noise, enjoys the same convergence rate as its uncompressed counterpart. Extensive experiments demonstrate that z-SignFedAvg achieves competitive empirical performance on real datasets and outperforms existing schemes.
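
To make the noisy perturbation idea concrete, below is a minimal NumPy sketch, not the authors' implementation: each client adds zero-mean symmetric noise (uniform here) to its local update before taking the element-wise sign, and the server averages the resulting 1-bit messages. The function names, the uniform noise choice, and the step-size scaling are illustrative assumptions; the actual z-SignFedAvg additionally performs multiple local SGD steps per round and covers general symmetric noise distributions.

    import numpy as np

    def stochastic_sign_compress(grad, noise_scale=1.0, rng=None):
        # Perturb the gradient with zero-mean symmetric (uniform) noise,
        # then take the sign; noise_scale controls the bias-variance tradeoff.
        rng = np.random.default_rng() if rng is None else rng
        z = rng.uniform(-noise_scale, noise_scale, size=grad.shape)
        return np.sign(grad + z)

    def server_aggregate(sign_updates, step_size=0.01):
        # Average the clients' 1-bit messages and scale into a model update.
        return -step_size * np.mean(sign_updates, axis=0)

    # Toy usage: three clients each compress a local gradient.
    rng = np.random.default_rng(0)
    local_grads = [rng.normal(size=5) for _ in range(3)]
    msgs = [stochastic_sign_compress(g, noise_scale=0.5, rng=rng) for g in local_grads]
    print(server_aggregate(msgs))
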

Published

2024-03-24

How to Cite

Tang, Z., Wang, Y., & Chang, T.-H. (2024). z-SignFedAvg: A Unified Stochastic Sign-Based Compression for Federated Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 38(14), 15301-15309. https://doi.org/10.1609/aaai.v38i14.29454

Issue

Vol. 38 No. 14 (2024)

Section

AAAI Technical Track on Machine Learning V