Face the Facts: Using Face Averaging to Visualize Gender-by-Race Bias in Facial Analysis Algorithms

Authors

  • Kentrell Owens, Paul G. Allen School of Computer Science & Engineering, University of Washington
  • Erin Freiburger, Psychological and Brain Sciences, Indiana University Bloomington
  • Ryan Hutchings, Psychological and Brain Sciences, Indiana University Bloomington
  • Mattea Sim, Psychological and Brain Sciences, Indiana University Bloomington
  • Kurt Hugenberg, Psychological and Brain Sciences, Indiana University Bloomington
  • Franziska Roesner, Paul G. Allen School of Computer Science & Engineering, University of Washington
  • Tadayoshi Kohno, Paul G. Allen School of Computer Science & Engineering, University of Washington

DOI

https://doi.org/10.1609/aies.v7i1.31707

Abstract

We applied techniques from psychology (typically used to visualize human bias) to facial analysis systems, providing novel approaches for diagnosing and communicating algorithmic bias. First, we aggregated a diverse corpus of human facial images (N=1492) with self-identified gender and race. We tested four automated gender recognition (AGR) systems and found that some exhibited intersectional gender-by-race biases. Employing face averaging, a technique developed by psychologists, we created composite images to visualize these systems' outputs; for example, we visualized what an "average woman" looks like according to a system's output. Second, we conducted two online experiments in which participants judged the bias of hypothetical AGR systems. In the first experiment, with a convenience sample (N=228), facial visualizations communicated bias as strongly as statistics when the same results were depicted in both formats. In the second experiment, with only Black participants (N=223), facial visualizations communicated bias significantly more than statistics, suggesting that face averages are meaningful for communicating algorithmic bias.
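At its core, face averaging produces a composite by taking a pixel-wise mean over a set of face images that have been aligned to a common template. The sketch below is only an illustration of that core idea, not the authors' pipeline: it assumes the faces are already aligned (real implementations first warp each face to shared facial landmarks) and uses synthetic constant-intensity images in place of photographs.

```python
import numpy as np

def average_faces(faces):
    """Pixel-wise mean over a stack of aligned face images.

    faces: array-like of shape (n_faces, height, width, channels), uint8.
    Returns the composite "average face" as a uint8 image.

    NOTE: real face averaging first aligns each face to a common
    landmark template (e.g., via an affine or piecewise warp); that
    alignment step is omitted in this sketch.
    """
    stack = np.asarray(faces, dtype=np.float64)  # avoid uint8 overflow
    composite = stack.mean(axis=0)               # per-pixel average
    return np.clip(composite, 0, 255).astype(np.uint8)

# Toy example: three 4x4 RGB "faces" of constant intensity 0, 128, 255.
faces = np.stack([np.full((4, 4, 3), v, dtype=np.uint8) for v in (0, 128, 255)])
avg = average_faces(faces)
print(avg[0, 0, 0])  # mean of 0, 128, 255 truncated to uint8 -> 127
```

To visualize a system's output as in the paper's approach, one would apply such an averaging step separately to the faces a system classified a given way (e.g., all faces labeled "woman") and compare the resulting composites.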

Published

2024-10-16

How to Cite

Owens, K., Freiburger, E., Hutchings, R., Sim, M., Hugenberg, K., Roesner, F., & Kohno, T. (2024). Face the Facts: Using Face Averaging to Visualize Gender-by-Race Bias in Facial Analysis Algorithms. Proceedings of the AAAI/ACM Conference on AI, Ethics, and Society, 7(1), 1101-1111. https://doi.org/10.1609/aies.v7i1.31707