Adaptive Perturbation-Based Gradient Estimation for Discrete Latent Variable Models

Authors

  • Pasquale Minervini, University of Edinburgh & University College London
  • Luca Franceschi, University College London
  • Mathias Niepert, University of Stuttgart

DOI

https://doi.org/10.1609/aaai.v37i8.26103

Keywords

ML: Deep Neural Network Algorithms, ML: Optimization

Abstract

The integration of discrete algorithmic components in deep learning architectures has numerous applications. Recently, Implicit Maximum Likelihood Estimation (IMLE), a class of gradient estimators for discrete exponential family distributions, was proposed by combining implicit differentiation through perturbation with the path-wise gradient estimator. However, due to the finite-difference approximation of the gradients, it is especially sensitive to the choice of the finite-difference step size, which needs to be specified by the user. In this work, we present Adaptive IMLE (AIMLE), the first adaptive gradient estimator for complex discrete distributions: it adaptively identifies the target distribution for IMLE by trading off the density of gradient information with the degree of bias in the gradient estimates. We empirically evaluate our estimator on synthetic examples, as well as on Learning to Explain, Discrete Variational Auto-Encoders, and Neural Relational Inference tasks. In our experiments, we show that our adaptive gradient estimator can produce faithful estimates while requiring orders of magnitude fewer samples than other gradient estimators.
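To make the mechanism concrete, below is a minimal NumPy sketch of a perturbation-based finite-difference gradient estimator for a discrete (argmax) component, with an adaptive step size that trades off gradient-signal density against finite-difference bias, as the abstract describes. The function names, the single-sample flip test, and the exponential update rule are illustrative assumptions for exposition, not the exact AIMLE algorithm from the paper.

```python
import numpy as np

def map_one_hot(theta):
    """MAP state of a categorical with logits theta: one-hot of the argmax."""
    z = np.zeros_like(theta)
    z[np.argmax(theta)] = 1.0
    return z

def imle_style_gradient(theta, dL_dz, lam):
    """Finite-difference estimate in the spirit of IMLE: compare the MAP
    state with the MAP state of logits nudged against the downstream
    gradient, scaled by the step size lam."""
    z = map_one_hot(theta)
    z_nudged = map_one_hot(theta - lam * dL_dz)
    return (z - z_nudged) / lam, z, z_nudged

def adapt_step_size(lam, z, z_nudged, target_flip_rate=0.5, rate=0.1):
    """Illustrative adaptive rule (an assumption, not the paper's):
    grow lam when the nudge changed nothing (no gradient signal),
    shrink it when it flips the state too often (more bias)."""
    flipped = float(not np.allclose(z, z_nudged))
    return lam * np.exp(rate * (target_flip_rate - flipped))

# Toy usage: logits of a 4-way categorical and a made-up downstream gradient.
theta = np.array([0.2, 1.5, -0.3, 0.7])
dL_dz = np.array([0.0, 1.0, -1.0, 0.0])  # pretend the loss prefers state 2
lam = 1.0
for _ in range(5):
    grad, z, z_nudged = imle_style_gradient(theta, dL_dz, lam)
    lam = adapt_step_size(lam, z, z_nudged)
    theta -= 0.5 * grad  # gradient-descent step on the logits
```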

Published

2023-06-26

How to Cite

Minervini, P., Franceschi, L., & Niepert, M. (2023). Adaptive Perturbation-Based Gradient Estimation for Discrete Latent Variable Models. Proceedings of the AAAI Conference on Artificial Intelligence, 37(8), 9200-9208. https://doi.org/10.1609/aaai.v37i8.26103

Section

AAAI Technical Track on Machine Learning III