TEACh: Task-Driven Embodied Agents That Chat

Authors

  • Aishwarya Padmakumar, Amazon Alexa AI
  • Jesse Thomason, University of Southern California & Amazon Alexa AI
  • Ayush Shrivastava, University of Michigan
  • Patrick Lange, Amazon Alexa AI
  • Anjali Narayan-Chen, Amazon Alexa AI
  • Spandana Gella, Amazon Alexa AI
  • Robinson Piramuthu, Amazon Alexa AI
  • Gokhan Tur, Amazon Alexa AI
  • Dilek Hakkani-Tur, Amazon Alexa AI

DOI:

https://doi.org/10.1609/aaai.v36i2.20097

Keywords:

Computer Vision (CV), Speech & Natural Language Processing (SNLP)

Abstract

Robots operating in human spaces must be able to engage in natural language interaction: both understanding and executing instructions, and using conversation to resolve ambiguity and correct mistakes. To study this, we introduce TEACh, a dataset of over 3,000 human-human interactive dialogues to complete household tasks in simulation. A Commander with access to oracle information about a task communicates in natural language with a Follower. The Follower navigates through and interacts with the environment to complete tasks varying in complexity from "Make Coffee" to "Prepare Breakfast", asking questions and getting additional information from the Commander. We propose three benchmarks using TEACh to study embodied intelligence challenges, and we evaluate initial models' abilities in dialogue understanding, language grounding, and task execution.
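
To make the data format concrete, here is a minimal Python sketch of reading one TEACh gameplay session and printing its dialogue turns. The file path and the JSON field names ("interactions", "agent_id", "utterance") are assumptions for illustration, not the confirmed schema; consult the TEACh data release (https://github.com/alexa/teach) for the actual format.

    import json
    from pathlib import Path

    # Assumed role ids: 0 = Commander (has oracle task info), 1 = Follower (acts).
    AGENT_NAMES = {0: "Commander", 1: "Follower"}

    # Hypothetical path to one recorded session; adjust to the real data layout.
    session_path = Path("teach_data/games/train/example_session.json")
    with session_path.open() as f:
        session = json.load(f)

    # Walk the interaction log and keep only dialogue events, skipping
    # navigation and object-manipulation actions that carry no utterance.
    for event in session.get("interactions", []):
        utterance = event.get("utterance")
        if utterance:
            role = AGENT_NAMES.get(event.get("agent_id"), "Unknown")
            print(f"{role}: {utterance}")

Run against a real session file, a loop like this would reconstruct the Commander/Follower exchange for a task such as "Make Coffee".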

Published

2022-06-28

How to Cite

Padmakumar, A., Thomason, J., Shrivastava, A., Lange, P., Narayan-Chen, A., Gella, S., Piramuthu, R., Tur, G., & Hakkani-Tur, D. (2022). TEACh: Task-Driven Embodied Agents That Chat. Proceedings of the AAAI Conference on Artificial Intelligence, 36(2), 2017-2025. https://doi.org/10.1609/aaai.v36i2.20097

Issue

Vol. 36 No. 2 (2022)

Section

AAAI Technical Track on Computer Vision II