Kernelized Normalizing Constant Estimation: Bridging Bayesian Quadrature and Bayesian Optimization

Authors

  • Xu Cai, National University of Singapore
  • Jonathan Scarlett, National University of Singapore

DOI:

https://doi.org/10.1609/aaai.v38i10.28992

Keywords:

ML: Bayesian Learning, ML: Information Theory, ML: Kernel Methods, ML: Online Learning & Bandits

Abstract

In this paper, we study the problem of estimating the normalizing constant ∫ exp(-λf(x)) dx through queries to the black-box function f, where f belongs to a reproducing kernel Hilbert space (RKHS) and λ is a problem parameter. We show that to estimate the normalizing constant within a small relative error, the level of difficulty depends on the value of λ: when λ approaches zero, the problem is similar to Bayesian quadrature (BQ), while when λ approaches infinity, the problem is similar to Bayesian optimization (BO). More generally, the problem varies between BQ and BO. We find that this pattern holds even when the function evaluations are noisy, bringing new aspects to this topic. Our findings are supported by both algorithm-independent lower bounds and algorithmic upper bounds, as well as simulation studies conducted on a variety of benchmark functions.
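To make the λ-dependence concrete, here is a minimal numerical sketch (not the paper's algorithm; the test function, its kernel centres and coefficients, and the dense-grid quadrature are all illustrative assumptions). It evaluates Z(λ) = ∫ exp(-λf(x)) dx for a simple RKHS function f on [0, 1]: for small λ, Z is governed by the integral of f (a quadrature-like problem), while for large λ, -(1/λ) log Z approaches min f (an optimization-like problem).

```python
# Minimal sketch (illustrative, not the paper's method): how
# Z(lambda) = integral of exp(-lambda * f(x)) dx interpolates between a
# quadrature-like quantity (small lambda) and an optimization-like
# quantity (large lambda). The test function f below is a hypothetical
# member of the RBF-kernel RKHS on [0, 1].
import numpy as np

def rbf(x, c, ell=0.1):
    # Squared-exponential (RBF) kernel centred at c with lengthscale ell.
    return np.exp(-(x - c) ** 2 / (2 * ell ** 2))

centers = np.array([0.2, 0.5, 0.8])   # assumed kernel centres
weights = np.array([1.0, -2.0, 0.5])  # assumed RKHS coefficients

def f(x):
    # A finite weighted sum of kernel functions, hence a member of the RKHS.
    return sum(w * rbf(x, c) for w, c in zip(weights, centers))

x = np.linspace(0.0, 1.0, 10_001)
dx = x[1] - x[0]
fx = f(x)
print(f"integral of f = {fx.sum() * dx:.4f},  min f = {fx.min():.4f}")

for lam in [1e-3, 1.0, 100.0]:
    # Dense-grid Riemann sum as a stand-in for the true normalizing constant.
    Z = np.exp(-lam * fx).sum() * dx
    # Small lambda: Z ~ 1 - lambda * (integral of f)   -> quadrature (BQ) regime
    # Large lambda: -(1/lambda) * log Z -> min f        -> optimization (BO) regime
    print(f"lambda={lam:8.3f}  Z={Z:.4e}  -(1/lambda) log Z = {-np.log(Z) / lam:8.4f}")
```

Running the sketch, -(1/λ) log Z tracks the integral of f at λ = 0.001 and approaches min f at λ = 100, matching the interpolation between BQ and BO described in the abstract.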

Published

2024-03-24

How to Cite

Cai, X., & Scarlett, J. (2024). Kernelized Normalizing Constant Estimation: Bridging Bayesian Quadrature and Bayesian Optimization. Proceedings of the AAAI Conference on Artificial Intelligence, 38(10), 11150-11158. https://doi.org/10.1609/aaai.v38i10.28992

Section

AAAI Technical Track on Machine Learning I