The Role of Adaptive Optimizers for Honest Private Hyperparameter Selection

Authors

  • Shubhankar Mohapatra, University of Waterloo
  • Sajin Sasy, University of Waterloo
  • Xi He, University of Waterloo
  • Gautam Kamath, University of Waterloo
  • Om Thakkar, Google

DOI:

https://doi.org/10.1609/aaai.v36i7.20749

Keywords:

Machine Learning (ML)

Abstract

Hyperparameter optimization is a ubiquitous challenge in machine learning, and the performance of a trained model depends crucially upon the effective selection of its hyperparameters. While a rich set of tools exists for this purpose, there are currently no practical hyperparameter selection methods under the constraint of differential privacy (DP). We study honest hyperparameter selection for differentially private machine learning, in which the process of hyperparameter tuning is accounted for in the overall privacy budget. To this end, we i) show that standard composition tools outperform more advanced techniques in many settings, ii) empirically and theoretically demonstrate an intrinsic connection between the learning rate and clipping norm hyperparameters, iii) show that adaptive optimizers like DPAdam enjoy a significant advantage in the process of honest hyperparameter tuning, and iv) draw upon novel limiting behaviour of Adam in the DP setting to design a new and more efficient optimizer.
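To make the coupling between the learning rate and the clipping norm concrete, below is a minimal, illustrative sketch of a single DPAdam-style step. It is not the authors' implementation; the function name `dp_adam_step` and all parameter defaults are assumptions for illustration. Per-example gradients are clipped to norm `clip_norm`, averaged, and perturbed with Gaussian noise whose scale is proportional to `noise_mult * clip_norm`, so in plain DP-SGD the update magnitude scales with the product of the learning rate and the clipping norm; Adam's normalization by the second-moment estimate largely cancels this product, which hints at why adaptive optimizers can shrink the hyperparameter search space.

```python
import numpy as np

def dp_adam_step(params, per_example_grads, m, v, t,
                 lr=1e-3, clip_norm=1.0, noise_mult=1.0,
                 beta1=0.9, beta2=0.999, eps=1e-8):
    """One illustrative DPAdam update (hypothetical sketch, not the paper's code):
    clip each per-example gradient to L2 norm `clip_norm`, average,
    add Gaussian noise with std `noise_mult * clip_norm`, then apply
    standard Adam moment updates with learning rate `lr`."""
    n = len(per_example_grads)
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        # Rescale so each example contributes at most `clip_norm` in L2 norm.
        clipped.append(g * min(1.0, clip_norm / (norm + 1e-12)))
    # Noise std scales with clip_norm: this is the source of the
    # learning-rate / clipping-norm coupling in non-adaptive DP-SGD.
    noise = np.random.normal(0.0, noise_mult * clip_norm, size=params.shape)
    g_priv = (np.sum(clipped, axis=0) + noise) / n

    # Standard Adam moment estimates and bias correction (t starts at 1).
    m = beta1 * m + (1 - beta1) * g_priv
    v = beta2 * v + (1 - beta2) * g_priv ** 2
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # The division by sqrt(v_hat) normalizes away the clip_norm scale.
    params = params - lr * m_hat / (np.sqrt(v_hat) + eps)
    return params, m, v
```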

Published

2022-06-28

How to Cite

Mohapatra, S., Sasy, S., He, X., Kamath, G., & Thakkar, O. (2022). The Role of Adaptive Optimizers for Honest Private Hyperparameter Selection. Proceedings of the AAAI Conference on Artificial Intelligence, 36(7), 7806-7813. https://doi.org/10.1609/aaai.v36i7.20749

Section

AAAI Technical Track on Machine Learning II