Constructing Deep Concepts through Shallow Search

Authors

  • Bonan Zhao, Princeton University
  • Christopher G. Lucas, University of Edinburgh
  • Neil R. Bramley, University of Edinburgh

DOI:

https://doi.org/10.1609/aaaiss.v3i1.31292

Keywords:

Concept Learning, Inductive Generalization, Bayesian Cognitive Modeling, Library Learning, Resource Rationality, Bootstrapping

Abstract

We propose bootstrap learning as a computational account of why human learning is modular and incremental, and identify key components of bootstrap learning that allow artificial systems to learn more like people. Originating in developmental psychology, bootstrap learning refers to people's ability to extend and repurpose existing knowledge to create new and more powerful ideas. We view bootstrap learning as an answer to how cognitively bounded reasoners grasp complex environmental dynamics that are far beyond their initial capacity: by searching ‘locally’ and recursively to extend their existing knowledge. Drawing on techniques from Bayesian library learning and resource-rational analysis, we propose a computational modeling framework that achieves human-like bootstrap learning performance in inductive conceptual inference. In addition, we present modeling and behavioral evidence that highlights the double-edged nature of bootstrap learning: people who process the same information in different batch orders can reach drastically different causal conclusions and generalizations, as a result of the different sub-concepts they construct in earlier stages of learning.
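To make the mechanism in the abstract concrete, the sketch below is a minimal toy illustration of shallow search over a growing concept library: each round tries only short (pairwise) compositions of known concepts, and the best-scoring candidate is cached back into the library so later rounds can build on it. The primitive names, the composition-only search space, and the scoring function are all illustrative assumptions on our part, not the authors' model.

```python
# Minimal sketch of bootstrap learning via shallow search over a concept
# library. All names and the toy scoring function are illustrative
# assumptions, not the authors' implementation.

from itertools import product

# Start with primitive concepts: functions from an integer to an integer.
LIBRARY = {
    "inc": lambda x: x + 1,
    "double": lambda x: x * 2,
}

def compose(f, g):
    """Return the composition f(g(x)), the only combinator searched over."""
    return lambda x: f(g(x))

def score(concept, examples):
    """Count how many (input, output) examples the concept reproduces."""
    return sum(concept(x) == y for x, y in examples)

def bootstrap(examples, rounds=1):
    """One or more rounds of shallow search: try all pairwise compositions
    of the current library, keep the best, and add it back to the library
    so later rounds can build on it (the 'bootstrap' step)."""
    for _ in range(rounds):
        candidates = {
            f"({a} . {b})": compose(LIBRARY[a], LIBRARY[b])
            for a, b in product(LIBRARY, repeat=2)
        }
        name, best = max(candidates.items(),
                         key=lambda kv: score(kv[1], examples))
        LIBRARY[name] = best  # cached sub-concept keeps future search shallow
    return name, best

# Data seen in one batch; this batch is consistent with double(inc(x)).
batch = [(1, 4), (2, 6), (3, 8)]
print(bootstrap(batch)[0])  # -> "(double . inc)"
```

Because each round's winner is cached, the order in which example batches arrive determines which sub-concepts enter the library first, and those early sub-concepts then shape what later shallow searches can reach; this is one way to picture the order effects the abstract describes.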

Published

2024-05-20

Section

Symposium on Human-Like Learning