Scaling Neural Program Synthesis with Distribution-Based Search

Authors

  • Nathanaël Fijalkow CNRS, LaBRI and Université de Bordeaux, France; The Alan Turing Institute, United Kingdom
  • Guillaume Lagarde CNRS, LaBRI and Université de Bordeaux, France
  • Théo Matricon CNRS, LaBRI and Université de Bordeaux, France
  • Kevin Ellis Cornell University, United States
  • Pierre Ohlmann University of Paris, France
  • Akarsh Nayan Potta Indian Institute of Technology Bombay, India

DOI:

https://doi.org/10.1609/aaai.v36i6.20616

Keywords:

Machine Learning (ML)

Abstract

We consider the problem of automatically constructing computer programs from input-output examples. We investigate how to augment probabilistic and neural program synthesis methods with new search algorithms, proposing a framework called distribution-based search. Within this framework, we introduce two new search algorithms: Heap Search, an enumerative method, and SQRT Sampling, a probabilistic method. We prove certain optimality guarantees for both methods, show how they integrate with probabilistic and neural techniques, and demonstrate how they can operate at scale across parallel compute environments. Collectively these findings offer theoretical and applied studies of search algorithms for program synthesis that integrate with recent developments in machine-learned program synthesizers.
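The two search strategies named in the abstract lend themselves to a short illustration. The following Python sketch is not the paper's implementation (which operates over probabilistic grammars describing an infinite program space); it assumes a finite, explicitly tabulated distribution over candidate programs, and the names model_distribution, heap_enumerate, and sqrt_sample are illustrative, not taken from the paper. The enumerative idea behind Heap Search is to visit programs in decreasing probability order via a heap; the idea behind SQRT Sampling is to draw programs with probability proportional to the square root of the model's probabilities.

    import heapq
    import random

    # Toy stand-in for a learned model: an explicit probability for each
    # candidate program. (Illustration only; the paper's algorithms work
    # over probabilistic grammars, not a finite dictionary.)
    model_distribution = {
        "prog_A": 0.50,
        "prog_B": 0.25,
        "prog_C": 0.15,
        "prog_D": 0.10,
    }

    def heap_enumerate(dist):
        """Yield programs in decreasing probability order using a heap,
        the core idea of an enumerative method such as Heap Search.
        heapq is a min-heap, so probabilities are negated."""
        heap = [(-p, prog) for prog, p in dist.items()]
        heapq.heapify(heap)
        while heap:
            neg_p, prog = heapq.heappop(heap)
            yield prog, -neg_p

    def sqrt_sample(dist):
        """Draw one program with probability proportional to the square
        root of its model probability, the distribution targeted by
        SQRT Sampling (renormalized here over a finite support)."""
        progs = list(dist)
        weights = [dist[p] ** 0.5 for p in progs]
        return random.choices(progs, weights=weights, k=1)[0]

    # Enumerative search: test candidates from most to least likely.
    for prog, p in heap_enumerate(model_distribution):
        print(f"trying {prog} (probability {p})")

    # Probabilistic search: draw i.i.d. samples from the sqrt distribution.
    print([sqrt_sample(model_distribution) for _ in range(5)])

Taking square roots flattens the distribution, so low-probability programs are sampled more often than under the model itself; this is what underlies the optimality guarantee for sampling-based search proved in the paper.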

Published

2022-06-28

How to Cite

Fijalkow, N., Lagarde, G., Matricon, T., Ellis, K., Ohlmann, P., & Potta, A. N. (2022). Scaling Neural Program Synthesis with Distribution-Based Search. Proceedings of the AAAI Conference on Artificial Intelligence, 36(6), 6623-6630. https://doi.org/10.1609/aaai.v36i6.20616

Section

AAAI Technical Track on Machine Learning I