Experimental Design for Optimization of Orthogonal Projection Pursuit Models

Authors

  • Mojmir Mutny, ETH Zurich
  • Johannes Kirschner, ETH Zurich
  • Andreas Krause, ETH Zurich

DOI:

https://doi.org/10.1609/aaai.v34i06.6585

Abstract

Bayesian optimization and kernelized bandit algorithms are widely used techniques for sequential black-box function optimization, with applications in parameter tuning, control, and robotics, among many others. To be effective in high-dimensional settings, previous approaches make additional assumptions, for example on low-dimensional subspaces or an additive structure. In this work, we go beyond the additivity assumption and use an orthogonal projection pursuit regression model, which strictly generalizes additive models. We present a two-stage algorithm motivated by experimental design that first decorrelates the additive components; subsequently, the bandit optimization benefits from the statistically efficient additive model. Our method provably decorrelates the fully additive model and achieves optimal sublinear simple regret in terms of the number of function evaluations. To prove the rotation recovery, we derive novel concentration inequalities for linear regression on subspaces. In addition, we specifically address the issue of acquisition function optimization and present two efficient, domain-dependent algorithms. We validate the algorithm numerically on synthetic as well as real-world optimization problems.
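
For intuition, a minimal sketch of the model class referred to above, in notation assumed here for illustration rather than taken from the paper: an orthogonal projection pursuit model can be written as

  f(x) = \sum_{i=1}^{d} f_i(u_i^\top x), \qquad U = [u_1, \dots, u_d] \ \text{orthogonal} \ (U^\top U = I),

where each f_i is a one-dimensional component function. The fully additive model is the special case U = I, so the class strictly generalizes additive models; the first stage of the two-stage algorithm estimates the rotation U, after which the statistically efficient additive model can be exploited for bandit optimization.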

Published

2020-04-03

How to Cite

Mutny, M., Kirschner, J., & Krause, A. (2020). Experimental Design for Optimization of Orthogonal Projection Pursuit Models. Proceedings of the AAAI Conference on Artificial Intelligence, 34(06), 10235-10242. https://doi.org/10.1609/aaai.v34i06.6585

Section

AAAI Technical Track: Reasoning under Uncertainty