TY - JOUR
AU - Ramakrishnan, Ramya
AU - Kamar, Ece
AU - Nushi, Besmira
AU - Dey, Debadeepta
AU - Shah, Julie
AU - Horvitz, Eric
PY - 2019/07/17
Y2 - 2024/03/29
TI - Overcoming Blind Spots in the Real World: Leveraging Complementary Abilities for Joint Execution
JF - Proceedings of the AAAI Conference on Artificial Intelligence
JA - AAAI
VL - 33
IS - 01
SE - AAAI Technical Track: Multiagent Systems
DO - 10.1609/aaai.v33i01.33016137
UR - https://ojs.aaai.org/index.php/AAAI/article/view/4571
SP - 6137-6145
AB - Simulators are being increasingly used to train agents before deploying them in real-world environments. While training in simulation provides a cost-effective way to learn, poorly modeled aspects of the simulator can lead to costly mistakes, or blind spots. While humans can help guide an agent towards identifying these error regions, humans themselves have blind spots and noise in execution. We study how learning about blind spots of both can be used to manage hand-off decisions when humans and agents jointly act in the real world, in which neither of them is trained or evaluated fully. The formulation assumes that agent blind spots result from representational limitations in the simulation world, which leads the agent to ignore important features that are relevant for acting in the open world. Our approach for blind spot discovery combines experiences collected in simulation with limited human demonstrations. The first step applies imitation learning to demonstration data to identify important features that the human is using but that the agent is missing. The second step uses noisy labels extracted from action mismatches between the agent and the human across simulation and demonstration data to train blind spot models. We show through experiments on two domains that our approach is able to learn a succinct representation that accurately captures blind spot regions and avoids dangerous errors in the real world through transfer of control between the agent and the human.
ER -