Symbols as a Lingua Franca for Bridging Human-AI Chasm for Explainable and Advisable AI Systems


  • Subbarao Kambhampati Arizona State University
  • Sarath Sreedharan Arizona State University
  • Mudit Verma Arizona State University
  • Yantian Zha Arizona State University
  • Lin Guan Arizona State University



Keywords: Explainable AI, Human-AI Interaction, Neuro-symbolic AI


Despite the surprising power of many modern AI systems that often learn their own representations, there is significant discontent about their inscrutability and the attendant problems in their ability to interact with humans. While alternatives such as neuro-symbolic approaches have been proposed, there is a lack of consensus on what they are about. There are often two independent motivations: (i) symbols as a lingua franca for human-AI interaction, and (ii) symbols as (system-produced) abstractions used in the system's internal reasoning. The jury is still out on whether AI systems will need to use symbols in their internal reasoning to achieve general intelligence capabilities. Whatever the answer here, the need for (human-understandable) symbols in human-AI interaction seems quite compelling. Symbols, like emotions, may well not be sine qua non for intelligence per se, but they will be crucial for AI systems to interact with us humans--as we can neither turn off our emotions nor get by without our symbols. In particular, in many human-designed domains, humans would be interested in providing explicit (symbolic) knowledge and advice--and expect machine explanations in kind. This alone requires AI systems to at least do their I/O in symbolic terms. In this blue sky paper, we argue this point of view, and discuss research directions that need to be pursued to allow for this type of human-AI interaction.




How to Cite

Kambhampati, S., Sreedharan, S., Verma, M., Zha, Y., & Guan, L. (2022). Symbols as a Lingua Franca for Bridging Human-AI Chasm for Explainable and Advisable AI Systems. Proceedings of the AAAI Conference on Artificial Intelligence, 36(11), 12262-12267.