Extending Analogical Generalization with Near-Misses

Authors

  • Matthew McLure, Northwestern University
  • Scott Friedman, Smart Information Flow Technologies (SIFT)
  • Kenneth Forbus, Northwestern University

DOI

https://doi.org/10.1609/aaai.v29i1.9228

Keywords

analogy, concept learning, near-misses, sketch recognition

Abstract

Concept learning is a central problem for cognitive systems. Generalization techniques can help organize examples by their commonalities, but comparisons with non-examples, called near-misses, can provide discrimination. Early work on near-misses required examples hand-selected by a teacher who understood the learner’s internal representations. This paper introduces Analogical Learning by Integrating Generalization and Near-misses (ALIGN) and describes three key advances. First, domain-general cognitive models of analogical processes are used to handle a wider range of examples. Second, ALIGN’s analogical generalization process constructs multiple probabilistic representations per concept via clustering, and hence can learn disjunctive concepts. Finally, ALIGN uses unsupervised analogical retrieval to find its own near-miss examples. We show that ALIGN outperforms analogical generalization alone on two perceptual data sets: (1) hand-drawn sketches and (2) geospatial concepts from strategy-game maps.
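As a concrete, deliberately simplified illustration of the near-miss idea, the Python sketch below maintains a probabilistic generalization as per-feature frequencies and uses a near-miss to upweight the features that discriminate the concept from a similar non-example. This is a hypothetical toy, not ALIGN itself: it uses flat feature sets and a single cluster, whereas ALIGN operates over structured relational representations with domain-general analogical matching, multiple generalizations per concept, and unsupervised retrieval of near-misses. All names below are invented for the sketch.

    # Hypothetical toy sketch of near-miss-driven discrimination.
    # NOT the paper's method: ALIGN works over structured relational
    # representations; here examples are flat sets of feature labels.
    from collections import defaultdict

    class Generalization:
        """A probabilistic generalization: per-feature frequencies plus
        discrimination weights learned from near-misses."""

        def __init__(self):
            self.n = 0                               # examples assimilated
            self.counts = defaultdict(int)           # feature -> count
            self.weights = defaultdict(lambda: 1.0)  # feature -> weight

        def assimilate(self, example):
            """Fold a positive example into the generalization."""
            self.n += 1
            for f in example:
                self.counts[f] += 1

        def probability(self, f):
            """Fraction of assimilated examples containing feature f."""
            return self.counts[f] / self.n if self.n else 0.0

        def score(self, example):
            """Weighted overlap used for classification."""
            return sum(self.probability(f) * self.weights[f] for f in example)

        def apply_near_miss(self, near_miss, boost=2.0):
            """Upweight concept features absent from a similar non-example:
            those are the features that discriminate the concept."""
            for f in self.counts:
                if f not in near_miss:
                    self.weights[f] *= boost

    # Usage: a Winston-style arch, with a near-miss of two bare posts.
    arch = Generalization()
    arch.assimilate({"left-post", "right-post", "lintel-on-top", "gap-between-posts"})
    arch.assimilate({"left-post", "right-post", "lintel-on-top"})
    arch.apply_near_miss({"left-post", "right-post"})  # similar, but not an arch
    print(arch.score({"left-post", "right-post", "lintel-on-top"}))  # lintel dominates

The near-miss here plays the role the abstract describes: a highly similar non-example whose comparison against the generalization reveals which features actually carry the concept, so they can be weighted more heavily during classification.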

Published

2015-02-10

How to Cite

McLure, M., Friedman, S., & Forbus, K. (2015). Extending Analogical Generalization with Near-Misses. Proceedings of the AAAI Conference on Artificial Intelligence, 29(1). https://doi.org/10.1609/aaai.v29i1.9228

Section

AAAI Technical Track: Cognitive Systems