Hey, Siri! Why Are You Biased against Women? (Student Abstract)

Authors

  • Surakshya Aryal, Howard University
  • Mikel K. Ngueajio, Howard University
  • Saurav Keshari Aryal, Howard University
  • Gloria Washington, Howard University

DOI:

https://doi.org/10.1609/aaai.v37i13.26937

Keywords:

Bias, Gender Bias, Sexual Abuse, Voice Assistant, Automatic Speech Recognition, Misogyny, Inclusivity, Bias Mitigation, Patriarchy, Harassment, Discrimination, Fairness, Ethics, Social AI, Sexism, Inequality, Social Biases, Human Computer Interaction, Machine Learning, Artificial Intelligence, Natural Language Processing, Applications Of AI, Gender, Alexa, Siri, Google Assistant, Cortana, Amazon, Apple, Google, Microsoft, Women, Stereotypes, Human-AI Interaction, ML: Applications

Abstract

The intersection of pervasive technology and verbal communication has resulted in Automatic Speech Recognition (ASR) systems, which automate the conversion of spontaneous speech into text. ASR enables human-computer interaction through speech and is being rapidly integrated into our daily lives. However, research on current ASR technologies has reported shortfalls in social inclusivity and accentuated biases and stereotypes toward minorities. In this work, we demonstrate preexisting sexist behavior in ASR systems through a systematic review of the research literature from the past five years. For each article, we identify the ASR technology used, highlight specific instances of reported bias, discuss the impact of this bias on the female community, and suggest possible methods of mitigation. By contributing to the growing body of research on this topic, we believe this paper provides insight into the harm that unchecked AI-powered technologies can inflict on a community and underscores the need for technological inclusivity for all demographics, especially women.

Published

2023-09-06

How to Cite

Aryal, S., Ngueajio, M. K., Aryal, S. K., & Washington, G. (2023). Hey, Siri! Why Are You Biased against Women? (Student Abstract). Proceedings of the AAAI Conference on Artificial Intelligence, 37(13), 16154-16155. https://doi.org/10.1609/aaai.v37i13.26937