FedNS: A Fast Sketching Newton-Type Algorithm for Federated Learning

Authors

  • Jian Li, Institute of Information Engineering, Chinese Academy of Sciences
  • Yong Liu, Gaoling School of Artificial Intelligence, Renmin University of China
  • Weiping Wang, Institute of Information Engineering, Chinese Academy of Sciences

DOI:

https://doi.org/10.1609/aaai.v38i12.29254

Keywords:

ML: Distributed Machine Learning & Federated Learning, ML: Kernel Methods, ML: Learning Theory

Abstract

Recent Newton-type federated learning algorithms have demonstrated linear convergence with respect to the communication rounds. However, communicating Hessian matrices is often infeasible due to their quadratic communication complexity. In this paper, we introduce a novel approach that tackles this issue while still achieving fast convergence rates. Our proposed method, named Federated Newton Sketch (FedNS), approximates the centralized Newton's method by communicating the sketched square-root Hessian instead of the exact Hessian. To enhance communication efficiency, we reduce the sketch size to the effective dimension of the Hessian matrix. We provide a convergence analysis grounded in statistical learning for the federated Newton sketch approaches. In particular, our approaches achieve super-linear convergence rates w.r.t. the communication rounds for the first time. We validate the effectiveness of our algorithms through various experiments, whose results coincide with our theoretical findings.
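The core idea described in the abstract — each client communicating a sketched square-root Hessian rather than the full Hessian — can be illustrated with a minimal sketch. The code below is NOT the authors' implementation; it is a hypothetical toy version assuming a local least-squares loss (whose square-root Hessian is simply the data matrix) and a Gaussian sketching matrix, with the server aggregating the sketched curvature to take an approximate Newton step.

```python
import numpy as np

def sketch_sqrt_hessian(A, sketch_size, rng):
    """Apply a Gaussian sketch S (i.i.d. N(0, 1/m) entries) to the
    square-root Hessian A, so that (S A)^T (S A) approximates A^T A.
    Only the (sketch_size x d) matrix S A needs to be communicated,
    instead of a d x d Hessian built from all n local samples."""
    m = sketch_size
    S = rng.standard_normal((m, A.shape[0])) / np.sqrt(m)
    return S @ A

def fed_newton_sketch_step(clients, w, sketch_size, rng):
    """One round of a federated Newton-sketch-style update for
    local losses 0.5 * ||X w - y||^2 (illustrative assumption)."""
    d = w.size
    H_approx = np.zeros((d, d))
    grad = np.zeros(d)
    for X, y in clients:
        # For least squares, the square-root Hessian is X itself.
        SA = sketch_sqrt_hessian(X, sketch_size, rng)
        H_approx += SA.T @ SA          # aggregated sketched curvature
        grad += X.T @ (X @ w - y)      # exact local gradient
    # Approximate Newton step with a small ridge term for stability.
    return w - np.linalg.solve(H_approx + 1e-8 * np.eye(d), grad)
```

With a sketch size on the order of the effective dimension, each round contracts the error like a Newton step while communicating only an m x d matrix per client.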

Published

2024-03-24

How to Cite

Li, J., Liu, Y., & Wang, W. (2024). FedNS: A Fast Sketching Newton-Type Algorithm for Federated Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 38(12), 13509-13517. https://doi.org/10.1609/aaai.v38i12.29254

Section

AAAI Technical Track on Machine Learning III