Differentially Private Stochastic Coordinate Descent

Authors

  • Georgios Damaskinos, École Polytechnique Fédérale de Lausanne (EPFL)
  • Celestine Mendler-Dünner, University of California, Berkeley
  • Rachid Guerraoui, École Polytechnique Fédérale de Lausanne (EPFL)
  • Nikolaos Papandreou, IBM Research Zurich
  • Thomas Parnell, IBM Research

DOI:

https://doi.org/10.1609/aaai.v35i8.16882

Keywords:

Ethics -- Bias, Fairness, Transparency & Privacy, Optimization

Abstract

In this paper we tackle the challenge of making the stochastic coordinate descent algorithm differentially private. Unlike classical gradient descent, where updates operate on a single model vector and adding controlled noise to this vector suffices to hide critical information about individuals, stochastic coordinate descent crucially relies on keeping auxiliary information in memory during training. This auxiliary information constitutes an additional privacy leak and poses the major challenge addressed in this work. Driven by the insight that, under independent noise addition, the consistency of the auxiliary information holds in expectation, we present DP-SCD, the first differentially private stochastic coordinate descent algorithm. We analyze our new method theoretically and argue that decoupling and parallelizing coordinate updates is essential for its utility. On the empirical side, we demonstrate competitive performance against the popular stochastic gradient descent alternative (DP-SGD) while requiring significantly less tuning.
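To make the idea of the abstract concrete, the sketch below illustrates (in a simplified, hypothetical form, not the authors' DP-SCD algorithm) how coordinate descent can keep an auxiliary vector Xw in memory and still add noise: each coordinate update is clipped and perturbed, and the same noisy delta is applied to both the model and the cached Xw, so the auxiliary information stays consistent with the model in expectation. The clipping bound, noise scale, and step size are placeholders, and a rigorous privacy guarantee would require a proper per-example sensitivity analysis and accounting, which this toy example omits.

```python
import numpy as np

def dp_coordinate_descent(X, y, lam=1.0, epochs=10,
                          clip=1.0, noise_std=0.5, seed=None):
    """Illustrative noisy coordinate descent for ridge regression (sketch only)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    Xw = X @ w  # auxiliary information kept in memory during training

    for _ in range(epochs):
        for j in rng.permutation(d):
            xj = X[:, j]
            # Coordinate-wise gradient of 0.5*||Xw - y||^2 + 0.5*lam*||w||^2,
            # computed from the cached auxiliary vector Xw.
            grad = xj @ (Xw - y) + lam * w[j]
            # Clip to bound the update's magnitude, then add Gaussian noise.
            grad = np.clip(grad, -clip, clip) + rng.normal(0.0, noise_std * clip)
            step = 1.0 / (xj @ xj + lam)
            delta = -step * grad
            # Apply the *same* noisy delta to the model and to the auxiliary
            # vector, so the cached Xw matches X @ w in expectation.
            w[j] += delta
            Xw += delta * xj
    return w
```

For example, `dp_coordinate_descent(X, y, noise_std=0.0)` recovers plain coordinate descent, while a positive `noise_std` trades accuracy for the noise that a differentially private variant would require.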

Published

2021-05-18

How to Cite

Damaskinos, G., Mendler-Dünner, C., Guerraoui, R., Papandreou, N., & Parnell, T. (2021). Differentially Private Stochastic Coordinate Descent. Proceedings of the AAAI Conference on Artificial Intelligence, 35(8), 7176-7184. https://doi.org/10.1609/aaai.v35i8.16882

Section

AAAI Technical Track on Machine Learning I