Investigating What Factors Influence Users’ Rating of Harmful Algorithmic Bias and Discrimination

Authors

  • Sara Kingsley, Carnegie Mellon University
  • Jiayin Zhi, Carnegie Mellon University
  • Wesley Hanwen Deng, Carnegie Mellon University
  • Jaimie Lee, Carnegie Mellon University
  • Sizhe Zhang, Carnegie Mellon University
  • Motahhare Eslami, Carnegie Mellon University
  • Kenneth Holstein, Carnegie Mellon University
  • Jason I. Hong, Carnegie Mellon University
  • Tianshi Li, Northeastern University
  • Hong Shen, Carnegie Mellon University

DOI:

https://doi.org/10.1609/hcomp.v12i1.31602

Abstract

There has been growing recognition of the crucial role users, especially those from marginalized groups, play in uncovering harmful algorithmic biases. However, it remains unclear how users’ identities and experiences shape their ratings of harmful biases. We present an online experiment (N=2,197) examining three factors: demographics, experiences of discrimination, and social and technical knowledge. Participants were shown examples of image search results, including ones that prior literature has identified as biased against marginalized racial, gender, or sexual orientation groups. We found that participants from marginalized gender or sexual orientation groups were more likely to rate the examples as more severely harmful; we did not find a similar pattern for participants from marginalized racial groups. Additional factors affecting users’ ratings included experiences of discrimination and having friends or family belonging to marginalized demographics. A qualitative analysis offers insight into how users recognize bias and why they see biases the way they do. We provide guidance for designing future methods to support effective user-driven auditing.

Published

2024-10-14

How to Cite

Kingsley, S., Zhi, J., Deng, W. H., Lee, J., Zhang, S., Eslami, M., Holstein, K., Hong, J. I., Li, T., & Shen, H. (2024). Investigating What Factors Influence Users’ Rating of Harmful Algorithmic Bias and Discrimination. Proceedings of the AAAI Conference on Human Computation and Crowdsourcing, 12(1), 75-85. https://doi.org/10.1609/hcomp.v12i1.31602