Secure Bilevel Asynchronous Vertical Federated Learning with Backward Updating


  • Qingsong Zhang, Xidian University; JD Tech
  • Bin Gu, MBZUAI; JD Finance America Corporation
  • Cheng Deng, Xidian University
  • Heng Huang, University of Pittsburgh; JD Finance America Corporation



Distributed Machine Learning & Federated Learning


Vertical federated learning (VFL) is attracting increasing attention due to the emerging demand for multi-party collaborative modeling and concerns about privacy leakage. In real VFL applications, usually only one or a few parties hold labels, which makes it challenging for all parties to collaboratively learn the model without privacy leakage. Meanwhile, most existing VFL algorithms are restricted to synchronous computation, which leads to inefficiency in real-world applications. To address these challenges, we propose a novel VFL framework that integrates a new backward updating mechanism with a bilevel asynchronous parallel architecture (VFB^2), under which three new algorithms, VFB^2-SGD, -SVRG, and -SAGA, are proposed. We derive the convergence rates of these three algorithms under both strongly convex and nonconvex conditions. We also prove the security of VFB^2 under semi-honest threat models. Extensive experiments on benchmark datasets demonstrate that our algorithms are efficient, scalable, and lossless.
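To illustrate the setting described in the abstract, the following is a minimal, simplified sketch (not the paper's actual VFB^2 algorithm) of SGD in a vertical partition with a backward-updating-style step: features are split column-wise across parties, only the active party holds labels, and it sends back an intermediate gradient so that label-free passive parties can also update their local parameters. The partition, learning rate, and loss are illustrative assumptions; the real framework additionally runs the parties asynchronously.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy vertical setting: two parties each hold a column slice of the
# features; only party 0 (the active party) holds the labels y.
n, d = 100, 6
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)

slices = [slice(0, 3), slice(3, 6)]   # vertical feature partition
w = [np.zeros(3), np.zeros(3)]        # each party's local model
lr = 0.05

for step in range(2000):
    i = rng.integers(n)               # SGD: one sample per step
    # Forward: each party contributes a partial score from its own slice.
    partial = [X[i, s] @ w[p] for p, s in enumerate(slices)]
    pred = sum(partial)
    # Backward updating (simplified): the active party computes the loss
    # gradient w.r.t. the joint prediction and sends this scalar back, so
    # every party, including label-free ones, updates locally without
    # exchanging raw features or labels.
    g = pred - y[i]                   # squared-loss gradient
    for p, s in enumerate(slices):
        w[p] -= lr * g * X[i, s]

mse = np.mean((X @ np.concatenate(w) - y) ** 2)
```

Only the scalar intermediate gradient crosses party boundaries in this sketch; the paper's security analysis concerns what such exchanged quantities can reveal under semi-honest adversaries.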




How to Cite

Zhang, Q., Gu, B., Deng, C., & Huang, H. (2021). Secure Bilevel Asynchronous Vertical Federated Learning with Backward Updating. Proceedings of the AAAI Conference on Artificial Intelligence, 35(12), 10896-10904.



AAAI Technical Track on Machine Learning V