Towards an Ontology for Generating Behaviors for Socially Assistive Robots Helping Young Children

Authors

  • Yuqi Yang, Franklin & Marshall College
  • Allison Langer, Temple University
  • Lauren Howard, Franklin & Marshall College
  • Peter J. Marshall, Temple University
  • Jason R. Wilson, Franklin & Marshall College

DOI:

https://doi.org/10.1609/aaaiss.v2i1.27674

Keywords:

Human-Robot Interaction, Child-Robot Interaction, Socially Assistive Robot, Upper Ontology, Hierarchical Task Network

Abstract

Socially assistive robots (SARs) have the potential to revolutionize educational experiences by providing safe, non-judgmental, and emotionally supportive environments for children's social development. The success of SARs relies on the synergy of different modalities, such as speech, gestures, and gaze, to maximize the interactive experience. This paper presents an approach for generating SAR behaviors that extends an upper ontology. The ontology may enable flexible and scalable adaptive behavior generation by defining key assistive intents, turn-taking, and input properties. We compare the generated behaviors with hand-coded behaviors that were validated in an experiment with young children. The results demonstrate that the automated approach covers the majority of the manually developed behaviors while allowing for significant adaptation to specific circumstances. The technical framework holds potential for broader interoperability in other assistive domains and facilitates the generation of context-dependent and socially appropriate robot behaviors.
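For illustration only, the following is a minimal Python sketch of how an ontology fragment covering assistive intents, turn-taking, and input properties might drive behavior generation. The class names (AssistiveIntent, TurnTaking, InputProperty, Behavior), the function generate_behavior, and the selection rules are hypothetical placeholders assumed for this sketch; they are not the ontology or implementation described in the paper.

# Hypothetical sketch of ontology-driven SAR behavior generation.
# All names and rules below are illustrative assumptions, not the paper's system.
from dataclasses import dataclass
from typing import List


@dataclass
class AssistiveIntent:
    """An assistive goal the robot pursues (e.g., 'encourage')."""
    name: str


@dataclass
class TurnTaking:
    """Who currently holds the interaction turn: 'robot' or 'child'."""
    holder: str


@dataclass
class InputProperty:
    """An observed property of the child's input (e.g., hesitation level)."""
    name: str
    value: float


@dataclass
class Behavior:
    """A multimodal robot behavior combining speech, gesture, and gaze."""
    speech: str
    gesture: str
    gaze: str


def generate_behavior(intent: AssistiveIntent,
                      turn: TurnTaking,
                      inputs: List[InputProperty]) -> Behavior:
    """Map intent, turn state, and input properties to a concrete behavior.

    A toy rule standing in for the ontology-driven generation process; a real
    system would traverse the ontology to select and parameterize behaviors.
    """
    hesitating = any(p.name == "hesitation" and p.value > 0.5 for p in inputs)
    if turn.holder == "child" and hesitating:
        return Behavior(speech="Take your time, you're doing great!",
                        gesture="open_palms", gaze="child")
    if intent.name == "encourage":
        return Behavior(speech="Nice work! What comes next?",
                        gesture="nod", gaze="task")
    return Behavior(speech="", gesture="idle", gaze="child")


if __name__ == "__main__":
    print(generate_behavior(AssistiveIntent("encourage"),
                            TurnTaking("child"),
                            [InputProperty("hesitation", 0.8)]))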

Published

2024-01-22

Section

Artificial Intelligence for Human-Robot Interaction (AI-HRI)