Fast and Efficient MMD-Based Fair PCA via Optimization over Stiefel Manifold

Authors

  • Junghyun Lee, Kim Jaechul Graduate School of AI, KAIST, Seoul, Republic of Korea
  • Gwangsu Kim, School of Electrical Engineering, KAIST, Daejeon, Republic of Korea
  • Mahbod Olfat, UC Berkeley IEOR, Berkeley, CA, USA; Citadel, Chicago, IL, USA
  • Mark Hasegawa-Johnson, Department of Electrical and Computer Engineering, University of Illinois Urbana-Champaign, IL, USA
  • Chang D. Yoo, School of Electrical Engineering, KAIST, Daejeon, Republic of Korea

DOI:

https://doi.org/10.1609/aaai.v36i7.20699

Keywords:

Machine Learning (ML), Philosophy And Ethics Of AI (PEAI)

Abstract

This paper defines fair principal component analysis (PCA) as minimizing the maximum mean discrepancy (MMD) between the dimensionality-reduced conditional distributions of different protected classes. The incorporation of MMD naturally leads to an exact and tractable mathematical formulation of fairness with good statistical properties. We formulate the problem of fair PCA subject to MMD constraints as a non-convex optimization over the Stiefel manifold and solve it using the Riemannian Exact Penalty Method with Smoothing (REPMS). Importantly, we provide a local optimality guarantee and explicitly show the theoretical effect of each hyperparameter in practical settings, extending previous results. Experimental comparisons based on synthetic and UCI datasets show that our approach outperforms prior work in explained variance, fairness, and runtime.
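To make the formulation concrete, here is a minimal numerical sketch of the idea described in the abstract: maximize explained variance of a projection V with orthonormal columns (a point on the Stiefel manifold) while penalizing the discrepancy between the projected protected groups. This is an illustrative simplification, not the paper's REPMS algorithm: it uses a linear-kernel MMD (which reduces to the squared distance between projected group means), a fixed quadratic penalty rather than the exact penalty method, and plain Riemannian gradient ascent with a QR retraction. The function `fair_pca_sketch` and all its parameters are hypothetical names for this sketch.

```python
import numpy as np

def fair_pca_sketch(X, groups, k=1, mu=10.0, lr=1e-3, iters=1000, seed=0):
    """Illustrative fair-PCA sketch (NOT the paper's REPMS method):
    maximize trace(V^T S V) - mu * MMD^2 over the Stiefel manifold,
    where MMD^2 uses a linear kernel, i.e. the squared distance between
    projected group means. V is kept orthonormal via QR retraction."""
    rng = np.random.default_rng(seed)
    X = X - X.mean(axis=0)
    d = X.shape[1]
    S = X.T @ X / len(X)                      # sample covariance
    m = X[groups == 0].mean(0) - X[groups == 1].mean(0)
    M = np.outer(m, m)                        # linear-kernel MMD^2(V) = trace(V^T M V)
    V, _ = np.linalg.qr(rng.standard_normal((d, k)))  # random Stiefel point
    A = S - mu * M                            # penalized objective matrix
    for _ in range(iters):
        G = 2 * A @ V                         # Euclidean gradient
        G -= V @ (V.T @ G + G.T @ V) / 2      # project onto tangent space of Stiefel manifold
        V, _ = np.linalg.qr(V + lr * G)       # retract back onto the manifold
    return V
```

With a large penalty `mu`, the returned V should align with high-variance directions that carry little information about group membership, which is the qualitative behavior the paper's MMD-constrained formulation enforces exactly.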

Published

2022-06-28

How to Cite

Lee, J., Kim, G., Olfat, M., Hasegawa-Johnson, M., & Yoo, C. D. (2022). Fast and Efficient MMD-Based Fair PCA via Optimization over Stiefel Manifold. Proceedings of the AAAI Conference on Artificial Intelligence, 36(7), 7363-7371. https://doi.org/10.1609/aaai.v36i7.20699

Section

AAAI Technical Track on Machine Learning II