Learning Parameterized Task Structure for Generalization to Unseen Entities

Authors

  • Anthony Liu University of Michigan
  • Sungryull Sohn LG AI Research
  • Mahdi Qazwini University of Michigan
  • Honglak Lee LG AI Research / University of Michigan

DOI:

https://doi.org/10.1609/aaai.v36i7.20718

Keywords:

Machine Learning (ML), Knowledge Representation And Reasoning (KRR), Planning, Routing, And Scheduling (PRS)

Abstract

Real-world tasks are hierarchical and compositional. Tasks can be composed of multiple subtasks (or sub-goals) that depend on each other. These subtasks are defined in terms of entities (e.g., "apple", "pear") that can be recombined to form new subtasks (e.g., "pickup apple" and "pickup pear"). To solve these tasks efficiently, an agent must infer subtask dependencies (e.g., an agent must execute "pickup apple" before "place apple in pot") and generalize the inferred dependencies to new subtasks (e.g., "place apple in pot" is similar to "place apple in pan"). Moreover, an agent may also need to solve unseen tasks, which can involve unseen entities. To this end, we formulate parameterized subtask graph inference (PSGI), a method for modeling subtask dependencies using first-order logic with factored entities. To facilitate this, we learn parameter attributes in a zero-shot manner, which are used as quantifiers (e.g., is_pickable(X)) for the factored subtask graph. We show this approach learns the latent structure of hierarchical and compositional tasks more accurately and efficiently than prior work, and show PSGI can generalize by modeling structure over subtasks unseen during adaptation.
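The abstract's core idea, dependencies expressed as first-order rules quantified over learned entity attributes, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the attribute table, subtask encoding, and `eligible` rule are all hypothetical simplifications of the paper's setup.

```python
# Hypothetical attribute table (in PSGI these attributes are learned
# zero-shot; here they are hand-coded for illustration).
ATTRIBUTES = {
    "apple": {"is_pickable"},
    "pear": {"is_pickable"},
    "pot": {"is_container"},
    "pan": {"is_container"},
}

def eligible(subtask, completed):
    """Parameterized precondition check.

    A subtask is a (name, args) pair, e.g. ("pickup", ("apple",)).
    The rule for place(X, Y) is quantified over attributes rather than
    tied to specific entities: it requires is_pickable(X),
    is_container(Y), and that pickup(X) has already been executed.
    """
    name, args = subtask
    if name == "pickup":
        (x,) = args
        return "is_pickable" in ATTRIBUTES[x]
    if name == "place":
        x, y = args
        return ("is_pickable" in ATTRIBUTES[x]
                and "is_container" in ATTRIBUTES[y]
                and ("pickup", (x,)) in completed)
    return False
```

Because the precondition quantifies over attributes instead of naming entities, the same rule covers unseen combinations: once `is_pickable("pear")` and `is_container("pan")` are known, `place(pear, pan)` is handled without re-learning the dependency.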

Published

2022-06-28

How to Cite

Liu, A., Sohn, S., Qazwini, M., & Lee, H. (2022). Learning Parameterized Task Structure for Generalization to Unseen Entities. Proceedings of the AAAI Conference on Artificial Intelligence, 36(7), 7534–7541. https://doi.org/10.1609/aaai.v36i7.20718

Section

AAAI Technical Track on Machine Learning II