Constrained Bayesian Optimization under Partial Observations: Balanced Improvements and Provable Convergence
DOI: https://doi.org/10.1609/aaai.v38i14.29488
Keywords: ML: Optimization, CSO: Constraint Optimization, CSO: Constraint Learning and Acquisition
Abstract
Partially observable constrained optimization problems (POCOPs) impede data-driven optimization techniques, since an infeasible solution to a POCOP may provide little information about either the objective or the constraints. We endeavor to design an efficient and provable method for expensive POCOPs under the framework of constrained Bayesian optimization. Our method consists of two key components. First, we present an improved design of the acquisition function that introduces balanced exploration during optimization, and we rigorously study its convergence properties to demonstrate its effectiveness. Second, we propose Gaussian processes embedding different likelihoods as the surrogate model for partially observable constraints. This model yields a more accurate representation of the feasible region than traditional classification-based models. Our proposed method is empirically studied on both synthetic and real-world problems, and the results demonstrate its competitiveness for solving POCOPs.
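To make the setting concrete, the following is a minimal, generic constrained Bayesian optimization sketch in the spirit the abstract describes: one Gaussian process models the objective and another models a constraint, and candidates are chosen by expected improvement weighted by the posterior probability of feasibility. This is a standard constrained-EI loop on an assumed 1-D toy problem (the objective `f`, constraint `c`, kernel length-scale, and budget are all illustrative choices), not the paper's actual acquisition design or likelihood-embedding model.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def rbf(A, B, ls=0.3):
    # Squared-exponential kernel on 1-D inputs (length-scale is an assumption).
    return np.exp(-0.5 * ((A[:, None] - B[None, :]) / ls) ** 2)

def gp_posterior(Xtr, ytr, Xte, noise=1e-6):
    # Zero-mean GP regression posterior at test points Xte.
    K = rbf(Xtr, Xtr) + noise * np.eye(len(Xtr))
    L = np.linalg.cholesky(K)
    Ks = rbf(Xtr, Xte)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, ytr))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.clip(1.0 - np.sum(v ** 2, axis=0), 1e-12, None)
    return mu, np.sqrt(var)

# Illustrative toy problem: minimize f(x) subject to c(x) <= 0 on [0, 2].
f = lambda x: np.sin(3.0 * x) + x   # objective
c = lambda x: 0.5 - x               # feasible region: x >= 0.5

X = rng.uniform(0.0, 2.0, size=6)   # initial design
Xgrid = np.linspace(0.0, 2.0, 201)  # candidate grid

for _ in range(10):
    yf, yc = f(X), c(X)
    mu_f, sd_f = gp_posterior(X, yf, Xgrid)
    mu_c, sd_c = gp_posterior(X, yc, Xgrid)

    feas = yc <= 0
    best = yf[feas].min() if feas.any() else yf.min()

    # Expected improvement on the objective ...
    z = (best - mu_f) / sd_f
    ei = sd_f * (z * norm.cdf(z) + norm.pdf(z))
    # ... weighted by the posterior probability of feasibility.
    pof = norm.cdf(-mu_c / sd_c)
    acq = ei * pof

    X = np.append(X, Xgrid[np.argmax(acq)])

feas = c(X) <= 0
x_best = X[feas][np.argmin(f(X[feas]))]
```

The EI-times-probability-of-feasibility product is the classical baseline that papers in this area improve upon; in the partially observable setting the constraint GP above would be replaced by a model whose likelihood accounts for the fact that infeasible evaluations return little or no information.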
Published
2024-03-24
How to Cite
Wang, S., & Li, K. (2024). Constrained Bayesian Optimization under Partial Observations: Balanced Improvements and Provable Convergence. Proceedings of the AAAI Conference on Artificial Intelligence, 38(14), 15607–15615. https://doi.org/10.1609/aaai.v38i14.29488
Section
AAAI Technical Track on Machine Learning V