Atlas of AI Risks: Enhancing Public Understanding of AI Risks

Authors

  • Edyta Bogucka, Nokia Bell Labs
  • Sanja Šćepanović, Nokia Bell Labs
  • Daniele Quercia, Nokia Bell Labs and King's College London

DOI:

https://doi.org/10.1609/hcomp.v12i1.31598

Abstract

The prevailing methodologies for visualizing AI risks have focused on technical issues such as data biases and model inaccuracies, often overlooking broader societal risks like job loss and surveillance. Moreover, these visualizations are typically designed for tech-savvy individuals, neglecting those with limited technical skills. To address these challenges, we propose the Atlas of AI Risks—a narrative-style tool designed to map the broad risks associated with various AI technologies in a way that is understandable to non-technical individuals as well. To both develop and evaluate this tool, we conducted two crowdsourcing studies. The first, involving 40 participants, identified the design requirements for visualizing AI risks for decision-making and guided the development of the Atlas. The second study, with 140 participants reflecting the US population in terms of age, sex, and ethnicity, assessed the usability and aesthetics of the Atlas to ensure it met those requirements. Using facial recognition technology as a case study, we found that the Atlas is more user-friendly than a baseline visualization, with a more classic and expressive aesthetic, and is more effective in presenting a balanced assessment of the risks and benefits of facial recognition. Finally, we discuss how our design choices make the Atlas adaptable for broader use, allowing it to generalize across the diverse range of technology applications represented in a database that reports various AI incidents.

Published

2024-10-14

How to Cite

Bogucka, E., Šćepanović, S., & Quercia, D. (2024). Atlas of AI Risks: Enhancing Public Understanding of AI Risks. Proceedings of the AAAI Conference on Human Computation and Crowdsourcing, 12(1), 33-43. https://doi.org/10.1609/hcomp.v12i1.31598