DPAUC: Differentially Private AUC Computation in Federated Learning

Authors

  • Jiankai Sun ByteDance Inc.
  • Xin Yang ByteDance Inc.
  • Yuanshun Yao ByteDance Inc.
  • Junyuan Xie ByteDance Ltd.
  • Di Wu ByteDance Ltd.
  • Chong Wang Apple Inc.

DOI:

https://doi.org/10.1609/aaai.v37i12.26770

Keywords:

General

Abstract

Federated learning (FL) has recently gained significant attention as a privacy-enhancing tool that allows multiple participants to jointly train a machine learning model. Prior work on FL has mostly studied how to protect label privacy during model training. However, model evaluation in FL might also lead to leakage of private label information. In this work, we propose an evaluation algorithm that can accurately compute the widely used AUC (area under the curve) metric when using label differential privacy (DP) in FL. Through extensive experiments, we show that our algorithm computes AUCs that are accurate compared to the ground truth. The code is available at https://github.com/bytedance/fedlearner/tree/master/example/privacy/DPAUC.
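
To illustrate the general idea behind label-DP AUC estimation, the following is a minimal Python sketch: prediction scores are bucketed into bins, Laplace noise is added to the per-bin positive and negative label counts, and AUC is estimated from the noisy counts. The function name dp_auc_from_histograms, the bin width, and the noise calibration are illustrative assumptions; this is not the algorithm proposed in the paper, whose implementation is available at the repository linked above.

    import numpy as np

    def dp_auc_from_histograms(scores, labels, num_bins=100, epsilon=1.0, rng=None):
        # Illustrative sketch only (hypothetical helper, not the paper's algorithm):
        # estimate AUC from per-bin positive/negative label counts with Laplace
        # noise added for label differential privacy.
        rng = np.random.default_rng() if rng is None else rng
        scores = np.asarray(scores, dtype=float)
        labels = np.asarray(labels, dtype=int)

        # Bucket prediction scores in [0, 1] into fixed-width bins.
        bins = np.clip((scores * num_bins).astype(int), 0, num_bins - 1)
        pos = np.bincount(bins[labels == 1], minlength=num_bins).astype(float)
        neg = np.bincount(bins[labels == 0], minlength=num_bins).astype(float)

        # Flipping one label moves one unit between the positive and negative
        # histograms, so the L1 sensitivity of the count pair is 2; Laplace
        # noise with scale 2/epsilon gives epsilon label-DP for the counts.
        noisy_pos = pos + rng.laplace(scale=2.0 / epsilon, size=num_bins)
        noisy_neg = neg + rng.laplace(scale=2.0 / epsilon, size=num_bins)

        # AUC from counts: a positive in a higher bin than a negative is a
        # "win"; a positive and a negative in the same bin count as half a win.
        # In practice, negative noisy counts would be post-processed (clipped).
        cum_neg = np.cumsum(noisy_neg)
        wins = np.sum(noisy_pos[1:] * cum_neg[:-1])
        ties = 0.5 * np.sum(noisy_pos * noisy_neg)
        total = noisy_pos.sum() * noisy_neg.sum()
        return (wins + ties) / total

    # Example usage with synthetic scores and labels.
    rng = np.random.default_rng(0)
    labels = rng.integers(0, 2, size=10_000)
    scores = np.clip(0.4 * labels + rng.normal(0.3, 0.2, size=10_000), 0.0, 1.0)
    print(dp_auc_from_histograms(scores, labels, epsilon=2.0, rng=rng))

With many examples per bin, the added noise largely averages out in the aggregate counts, which is why histogram-style estimates can stay close to the exact AUC while only noisy label statistics are released.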

Published

2023-06-26

How to Cite

Sun, J., Yang, X., Yao, Y., Xie, J., Wu, D., & Wang, C. (2023). DPAUC: Differentially Private AUC Computation in Federated Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 37(12), 15170-15178. https://doi.org/10.1609/aaai.v37i12.26770

Section

AAAI Special Track on Safe and Robust AI