OPT-GAN: A Broad-Spectrum Global Optimizer for Black-Box Problems by Learning Distribution
DOI: https://doi.org/10.1609/aaai.v37i10.26468
Keywords:
SO: Evolutionary Computation, ML: Adversarial Learning & Robustness, ML: Deep Generative Models & Autoencoders, ML: Optimization, SO: Heuristic Search
Abstract
Black-box optimization (BBO) algorithms are concerned with finding the best solutions for problems whose analytical details are missing. Most classical methods for such problems rely on strong, fixed a priori assumptions, such as Gaussianity. However, complex real-world problems, especially when the global optimum is sought, can be very far from these a priori assumptions because of their diversity, causing unexpected obstacles. In this study, we propose a generative adversarial net-based broad-spectrum global optimizer (OPT-GAN), which gradually estimates the distribution of the optimum, with strategies to balance the exploration-exploitation trade-off. It can better adapt to the regularity and structure of diversified landscapes than methods with a fixed prior, e.g., a Gaussian assumption or separability. Experiments on diverse BBO benchmarks and high-dimensional real-world applications show that OPT-GAN outperforms other traditional and neural net-based BBO algorithms. The code and Appendix are available at https://github.com/NBICLAB/OPT-GAN
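The core idea the abstract describes, iteratively refining a sampling distribution toward the optimum, can be illustrated with a minimal sketch. This is a simplified estimation-of-distribution loop with a fixed Gaussian model, not the authors' GAN-based method; OPT-GAN instead learns the sampling distribution with a generative adversarial network. The objective `black_box` and all parameter values below are hypothetical choices for illustration.

```python
import numpy as np

# Hypothetical black-box objective: sphere function (minimum at the origin).
def black_box(x):
    return np.sum(x ** 2, axis=-1)

def distribution_based_optimizer(dim=5, pop=50, elite_frac=0.2, iters=60, seed=0):
    """Sample candidates, evaluate the black box, and refit the sampling
    distribution to the best solutions. OPT-GAN replaces the fixed
    Gaussian model used here with a learned generative model."""
    rng = np.random.default_rng(seed)
    mean, std = np.zeros(dim), np.full(dim, 2.0)
    n_elite = max(1, int(pop * elite_frac))
    for _ in range(iters):
        samples = rng.normal(mean, std, size=(pop, dim))
        scores = black_box(samples)
        elites = samples[np.argsort(scores)[:n_elite]]
        # Shift the distribution gradually toward promising regions
        # (exploitation) while retaining spread for exploration.
        mean = 0.7 * elites.mean(axis=0) + 0.3 * mean
        std = 0.7 * elites.std(axis=0) + 0.3 * std + 1e-6
    return mean

best = distribution_based_optimizer()
print(black_box(best))  # converges toward 0 on this convex toy problem
```

The exploration-exploitation balance appears here as the smoothing factor on the mean and standard deviation updates; the paper's contribution is handling landscapes where such a fixed Gaussian model is a poor fit.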
Published
2023-06-26
How to Cite
Lu, M., Ning, S., Liu, S., Sun, F., Zhang, B., Yang, B., & Wang, L. (2023). OPT-GAN: A Broad-Spectrum Global Optimizer for Black-Box Problems by Learning Distribution. Proceedings of the AAAI Conference on Artificial Intelligence, 37(10), 12462-12472. https://doi.org/10.1609/aaai.v37i10.26468
Section
AAAI Technical Track on Search and Optimization