Bias and Variance of Post-processing in Differential Privacy

Authors

  • Keyu Zhu, Georgia Institute of Technology
  • Pascal Van Hentenryck, Georgia Institute of Technology
  • Ferdinando Fioretto, Syracuse University

Keywords

Ethics -- Bias, Fairness, Transparency & Privacy

Abstract

Post-processing immunity is a fundamental property of differential privacy: it enables the application of arbitrary data-independent transformations to differentially private outputs without affecting their privacy guarantees. When query outputs must satisfy domain constraints, post-processing can be used to project them back onto the feasible region. Moreover, when the feasible region is convex, a widely adopted class of post-processing steps is also guaranteed to improve accuracy. Post-processing has been applied successfully in many applications, including census data, energy systems, and mobility. However, its effects on the noise distribution are poorly understood: it is often argued that post-processing may introduce bias and increase variance. This paper takes a first step towards understanding the properties of post-processing. It considers the release of census data and examines, both empirically and theoretically, the behavior of a widely adopted class of post-processing functions.
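The bias/variance trade-off the abstract alludes to can be seen in a minimal simulation. The sketch below is illustrative only (not the paper's method or data): it answers a single count query with the Laplace mechanism, then post-processes by projecting the noisy answer onto the feasible region of nonnegative counts. The choice of `epsilon`, the zero true count, and the clipping projection are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

epsilon = 1.0          # privacy budget (illustrative choice)
true_count = 0.0       # a small census cell whose true value is zero
scale = 1.0 / epsilon  # Laplace scale for a sensitivity-1 count query

# Differentially private answers: add Laplace noise
# (unbiased, with variance 2 * scale**2).
noisy = true_count + rng.laplace(0.0, scale, size=100_000)

# Post-processing: project each answer onto the feasible region [0, inf),
# since counts cannot be negative. By post-processing immunity, this
# preserves the privacy guarantee.
projected = np.maximum(noisy, 0.0)

print(f"noisy:     mean = {noisy.mean():+.3f}, variance = {noisy.var():.3f}")
print(f"projected: mean = {projected.mean():+.3f}, variance = {projected.var():.3f}")
```

For a true count of zero, the projected estimate has expectation `scale / 2` rather than zero (positive bias), while its variance drops below that of the raw noisy answer — consistent with the abstract's point that post-processing reshapes the noise distribution rather than leaving it untouched.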

Published

2021-05-18

How to Cite

Zhu, K., Van Hentenryck, P., & Fioretto, F. (2021). Bias and Variance of Post-processing in Differential Privacy. Proceedings of the AAAI Conference on Artificial Intelligence, 35(12), 11177-11184. Retrieved from https://ojs.aaai.org/index.php/AAAI/article/view/17333

Section

AAAI Technical Track on Machine Learning V