TY - JOUR
AU - Honeycutt, Donald
AU - Nourani, Mahsan
AU - Ragan, Eric
PY - 2020/10/01
Y2 - 2024/03/28
TI - Soliciting Human-in-the-Loop User Feedback for Interactive Machine Learning Reduces User Trust and Impressions of Model Accuracy
JF - Proceedings of the AAAI Conference on Human Computation and Crowdsourcing
JA - HCOMP
VL - 8
IS - 1
SE - Full Papers
DO - 10.1609/hcomp.v8i1.7464
UR - https://ojs.aaai.org/index.php/HCOMP/article/view/7464
SP - 63-72
AB - Mixed-initiative systems allow users to interactively provide feedback to potentially improve system performance. Human feedback can correct model errors and update model parameters to dynamically adapt to changing data. Additionally, many users desire the ability to have a greater level of control and fix perceived flaws in systems they rely on. However, how the ability to provide feedback to autonomous systems influences user trust is a largely unexplored area of research. Our research investigates how the act of providing feedback can affect user understanding of an intelligent system and its accuracy. We present a controlled experiment using a simulated object detection system with image data to study the effects of interactive feedback collection on user impressions. The results show that providing human-in-the-loop feedback lowered both participants’ trust in the system and their perception of system accuracy, regardless of whether the system accuracy improved in response to their feedback. These results highlight the importance of considering the effects of allowing end-user feedback on user trust when designing intelligent systems.
ER -