Sample-and-Bound for Non-convex Optimization
DOI:
https://doi.org/10.1609/aaai.v38i18.30074
Keywords:
SO: Sampling/Simulation-based Search, SO: Non-convex Optimization
Abstract
Standard approaches for global optimization of non-convex functions, such as branch-and-bound, maintain partition trees to systematically prune the domain. The tree size grows exponentially in the number of dimensions. We propose new sampling-based methods for non-convex optimization that adapt Monte Carlo Tree Search (MCTS) to improve efficiency. Instead of the standard use of visitation counts in Upper Confidence Bounds, we utilize numerical overapproximations of the objective as an uncertainty metric, and also take into account sampled estimates of first-order and second-order information. The Monte Carlo tree in our approach avoids the usual fixed combinatorial patterns in growing the tree and aggressively zooms into the promising regions, while still balancing exploration and exploitation. We evaluate the proposed algorithms on high-dimensional non-convex optimization benchmarks against competitive baselines and analyze the effects of the hyperparameters.
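The following is a minimal sketch, in Python, of the bound-driven selection idea described in the abstract: a tree over subregions of the domain where, instead of a visitation-count bonus, each leaf's exploration term comes from a numerical overapproximation (here, a naive interval lower bound) of the objective. The function names (interval_lower_bound, score, sample_and_bound), the scoring formula, and the test objective are illustrative assumptions, not the authors' implementation.

# Hypothetical sketch: MCTS-style selection where the uncertainty of a
# subregion is measured by an interval overapproximation of the objective
# rather than by visit counts. Illustrative only.
import math
import random

def f(x):
    # Simple non-convex test objective (assumed for the example).
    return sum(xi * xi - math.cos(3 * xi) for xi in x)

def interval_lower_bound(box):
    # Crude interval lower bound of f over a box given as a list of (lo, hi):
    # x_i^2 is bounded below by 0 if the interval contains 0, else by
    # min(lo^2, hi^2); -cos(3 x_i) is bounded below by -1.
    lb = 0.0
    for lo, hi in box:
        sq = 0.0 if lo <= 0.0 <= hi else min(lo * lo, hi * hi)
        lb += sq - 1.0
    return lb

def sample_in(box):
    return [random.uniform(lo, hi) for lo, hi in box]

def score(node, c=0.5):
    # Lower score = more promising (minimization). Exploitation: best sampled
    # value in the region. Exploration: gap between that value and the
    # region's lower bound, used in place of a visit-count term.
    return node["best"] - c * (node["best"] - node["lb"])

def split(box):
    # Bisect the box along its widest dimension.
    i = max(range(len(box)), key=lambda k: box[k][1] - box[k][0])
    lo, hi = box[i]
    mid = 0.5 * (lo + hi)
    left, right = list(box), list(box)
    left[i], right[i] = (lo, mid), (mid, hi)
    return left, right

def sample_and_bound(box, iters=200, samples_per_node=8):
    def make_node(b):
        xs = [sample_in(b) for _ in range(samples_per_node)]
        return {"box": b, "best": min(f(x) for x in xs), "lb": interval_lower_bound(b)}

    leaves = [make_node(box)]
    incumbent = leaves[0]["best"]
    for _ in range(iters):
        # Prune leaves whose lower bound already exceeds the incumbent,
        # then expand the most promising remaining leaf.
        leaves = [n for n in leaves if n["lb"] <= incumbent]
        node = min(leaves, key=score)
        leaves.remove(node)
        for b in split(node["box"]):
            child = make_node(b)
            incumbent = min(incumbent, child["best"])
            leaves.append(child)
    return incumbent

if __name__ == "__main__":
    random.seed(0)
    print("best value found:", sample_and_bound([(-3.0, 3.0)] * 5))

The key design point mirrored here is that regions with a large gap between their sampled best value and their overapproximated bound are treated as uncertain and explored further, so the tree zooms into promising regions without a fixed splitting schedule.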
Published
2024-03-24
How to Cite
Zhai, Y., Qin, Z., & Gao, S. (2024). Sample-and-Bound for Non-convex Optimization. Proceedings of the AAAI Conference on Artificial Intelligence, 38(18), 20847-20855. https://doi.org/10.1609/aaai.v38i18.30074
Issue
Vol. 38 No. 18
Section
AAAI Technical Track on Search and Optimization