EDDIE: An Embodied AI System for Research and Intervention for Individuals with ASD

Authors

  • Robert Selkowitz, Canisius College
  • Jonathan Rodgers, Canisius College
  • P. Moskal, Canisius College
  • Jon Mrowczynski, Canisius College
  • Christine Colson, Canisius College

DOI

https://doi.org/10.1609/aaai.v30i1.9845

Abstract

We report on the ongoing development of EDDIE (Emotion Demonstration, Decoding, Interpretation, and Encoding), an interactive embodied AI to be deployed as an intervention system for children diagnosed with High-Functioning Autism Spectrum Disorders (HFASD). EDDIE presents the subject with interactive requests to decode facial expressions displayed by an avatar, to encode requested expressions, or to do both in a single session. Facial-tracking software interprets the subject's response and enables immediate feedback. The system fills a need in research and intervention for children with HFASD: it provides an engaging platform that presents exemplar expressions consistent with mechanical systems of facial action measurement, integrated with an automatic system that interprets the subject's expressions and delivers feedback. Both live interaction with EDDIE and video recordings of human-EDDIE interaction will be demonstrated.
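The abstract describes a simple interaction loop: EDDIE issues decode requests (the subject labels an expression shown by the avatar) and encode requests (the subject produces an expression, which the facial-tracking software classifies for immediate feedback). The Python sketch below is a hypothetical illustration of that loop only; the expression set, the `classify_expression` stub, and all function names are assumptions for illustration, not the authors' implementation.

```python
import random

# Hypothetical expression labels; the paper does not enumerate EDDIE's actual set.
EXPRESSIONS = ["happy", "sad", "angry", "surprised"]


def classify_expression(frame):
    """Stand-in for the facial-tracking software: map a captured frame
    to an expression label. Simulated here with a random guess."""
    return random.choice(EXPRESSIONS)


def decode_trial(target):
    """Decoding task: the avatar displays `target`; the subject names it."""
    print(f"[avatar] displaying a '{target}' expression")
    answer = input("What emotion is the avatar showing? ").strip().lower()
    correct = answer == target
    print("Correct!" if correct else f"Not quite -- that was '{target}'.")
    return correct


def encode_trial(target, frame=None):
    """Encoding task: the subject is asked to produce `target`; the
    tracker's label is compared against it for immediate feedback."""
    print(f"[prompt] Please show me a '{target}' face.")
    observed = classify_expression(frame)
    correct = observed == target
    print("Nice!" if correct
          else f"The tracker saw '{observed}'; let's try '{target}' again.")
    return correct


def run_session(n_trials=4):
    """A mixed session: decode and encode trials interleaved at random."""
    score = 0
    for _ in range(n_trials):
        target = random.choice(EXPRESSIONS)
        trial = random.choice([decode_trial, encode_trial])
        score += trial(target)
    print(f"Session complete: {score}/{n_trials} correct.")


if __name__ == "__main__":
    run_session()
```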

Published

2016-03-05

How to Cite

Selkowitz, R., Rodgers, J., Moskal, P., Mrowczynski, J., & Colson, C. (2016). EDDIE: An Embodied AI System for Research and Intervention for Individuals with ASD. Proceedings of the AAAI Conference on Artificial Intelligence, 30(1). https://doi.org/10.1609/aaai.v30i1.9845