Fairness for Robust Log Loss Classification

Authors

  • Ashkan Rezaei, University of Illinois at Chicago
  • Rizal Fathony, Carnegie Mellon University
  • Omid Memarrast, University of Illinois at Chicago
  • Brian Ziebart, University of Illinois at Chicago

DOI:

https://doi.org/10.1609/aaai.v34i04.6002

Abstract

Developing classification methods that achieve high accuracy while avoiding unfair treatment of different groups has become increasingly important for data-driven decision making in social applications. Many existing methods enforce fairness constraints on a chosen classifier (e.g., logistic regression) by directly forming constrained optimization problems. We instead derive a new classifier from the first principles of distributional robustness, incorporating fairness criteria into a worst-case logarithmic loss minimization. This construction takes the form of a minimax game and produces a parametric exponential family conditional distribution that resembles truncated logistic regression. We present the theoretical benefits of our approach in terms of its convexity and asymptotic convergence. We then demonstrate the practical advantages of our approach on three benchmark fairness datasets.
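The minimax construction described in the abstract can be sketched schematically; the notation below is illustrative and assumes the standard distributionally robust log loss setup, not the paper's exact formulation. A predictor \hat{P}(y \mid x) minimizes the worst-case expected logarithmic loss against an adversarial conditional distribution P(y \mid x) that is constrained to match empirical feature statistics (a set \Xi) and to satisfy the chosen fairness criterion (a set \mathcal{F}):

    \min_{\hat{P}(y \mid x)} \;\; \max_{P(y \mid x) \,\in\, \Xi \cap \mathcal{F}} \;\; \mathbb{E}_{x \sim \tilde{P},\; y \mid x \sim P}\!\left[ -\log \hat{P}(y \mid x) \right],

where \tilde{P} denotes the empirical distribution of the features. Under this kind of formulation, solving the inner maximization is what yields the parametric exponential-family conditional distribution (resembling truncated logistic regression) mentioned above.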

Published

2020-04-03

How to Cite

Rezaei, A., Fathony, R., Memarrast, O., & Ziebart, B. (2020). Fairness for Robust Log Loss Classification. Proceedings of the AAAI Conference on Artificial Intelligence, 34(04), 5511-5518. https://doi.org/10.1609/aaai.v34i04.6002

Issue

Vol. 34 No. 04 (2020)

Section

AAAI Technical Track: Machine Learning