Scalable Acceleration for Classification-Based Derivative-Free Optimization

Authors

  • Tianyi Han, Beijing Supreium Technology, Haidian District, Beijing
  • Jingya Li, Beijing Supreium Technology, Haidian District, Beijing
  • Zhipeng Guo, Beijing Supreium Technology, Haidian District, Beijing
  • Yuan Jin, Beijing Supreium Technology, Haidian District, Beijing

DOI:

https://doi.org/10.1609/aaai.v39i16.33874

Abstract

Derivative-free optimization algorithms play an important role in scientific and engineering design optimization problems, especially when derivative information is not accessible. In this paper, we study the framework of sequential classification-based derivative-free optimization algorithms. By introducing the learning-theoretic concept of the hypothesis-target shattering rate, we revisit the computational complexity upper bound of SRACOS. Inspired by the revisited upper bound, we propose an algorithm named RACE-CARS, which adds a random region-shrinking step to SRACOS. We further establish theorems showing the acceleration achieved by region shrinking. Experiments on synthetic functions as well as black-box tuning for language-model-as-a-service empirically demonstrate the efficiency of RACE-CARS. An ablation experiment on the introduced hyperparameters is also conducted, revealing the mechanism of RACE-CARS and offering empirical guidance on hyperparameter tuning.
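To make the abstract's idea concrete, here is a minimal, hypothetical sketch of a sequential sampling loop with a random region-shrinking step in the spirit described above. This is not the authors' implementation: the classifier is reduced to an axis-aligned box around the current best point, and the names `shrink_prob` and `shrink_rate` are illustrative hyperparameters, not taken from the paper.

```python
import random

def region_shrinking_dfo(f, dim, budget, bounds=(-1.0, 1.0),
                         shrink_prob=0.5, shrink_rate=0.9, seed=0):
    """Illustrative derivative-free minimization of f over a box.

    With probability shrink_prob per iteration, the sampling region
    contracts toward the incumbent best point by factor shrink_rate
    (a stand-in for the random region-shrinking step); otherwise it
    samples uniformly from the current region.
    """
    rng = random.Random(seed)
    lo, hi = bounds
    region = [(lo, hi)] * dim  # per-dimension sampling interval
    best_x = [rng.uniform(lo, hi) for _ in range(dim)]
    best_y = f(best_x)
    for _ in range(budget - 1):
        if rng.random() < shrink_prob:
            # shrink each interval toward the best point found so far
            region = [(bx - shrink_rate * (bx - l),
                       bx + shrink_rate * (h - bx))
                      for (l, h), bx in zip(region, best_x)]
        x = [rng.uniform(l, h) for (l, h) in region]
        y = f(x)
        if y < best_y:  # keep the incumbent minimizer
            best_x, best_y = x, y
    return best_x, best_y
```

For example, minimizing the sphere function `f(x) = sum(xi**2)` on `[-1, 1]^2` with a few hundred evaluations drives the incumbent value close to zero, since the sampling region concentrates around the optimum as it shrinks.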

Published

2025-04-11

How to Cite

Han, T., Li, J., Guo, Z., & Jin, Y. (2025). Scalable Acceleration for Classification-Based Derivative-Free Optimization. Proceedings of the AAAI Conference on Artificial Intelligence, 39(16), 17050–17058. https://doi.org/10.1609/aaai.v39i16.33874

Section

AAAI Technical Track on Machine Learning II