Bayesian Optimization over Permutation Spaces


  • Aryan Deshwal Washington State University
  • Syrine Belakaria Washington State University
  • Janardhan Rao Doppa Washington State University
  • Dae Hyun Kim Washington State University



Machine Learning (ML), Reasoning Under Uncertainty (RU), Search And Optimization (SO)


Optimizing expensive-to-evaluate black-box functions over an input space consisting of all permutations of d objects is an important problem with many real-world applications, for example, placing functional blocks in hardware design to optimize performance via simulations. The overall goal is to minimize the number of function evaluations needed to find high-performing permutations. The key challenge in solving this problem within the Bayesian optimization (BO) framework is to trade off the complexity of the statistical model against the tractability of acquisition function optimization. In this paper, we propose and evaluate two algorithms for BO over Permutation Spaces (BOPS). First, BOPS-T employs a Gaussian process (GP) surrogate model with Kendall kernels and a Tractable acquisition function optimization approach to select the sequence of permutations for evaluation. Second, BOPS-H employs a GP surrogate model with Mallows kernels and a Heuristic search approach to optimize the acquisition function. We theoretically analyze the performance of BOPS-T to show that its regret grows sub-linearly. Our experiments on multiple synthetic and real-world benchmarks show that both BOPS-T and BOPS-H perform better than the state-of-the-art BO algorithm for combinatorial spaces. To drive future research on this important problem, we make new resources and real-world benchmarks available to the community.
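The two surrogate kernels named in the abstract have standard closed forms over pairs of permutations: the Kendall kernel counts concordant minus discordant item pairs (normalized), and the Mallows kernel exponentiates the negative number of discordant pairs. The sketch below illustrates these standard definitions only; the paper's exact normalization, hyperparameters, and GP integration may differ.

```python
from itertools import combinations
from math import comb, exp

def kendall_kernel(pi, sigma):
    """Kendall kernel: (concordant - discordant pairs) / C(d, 2).

    Permutations are given as rank vectors of length d; the value lies
    in [-1, 1] and equals 1 when pi == sigma.
    """
    d = len(pi)
    n_c = n_d = 0
    for i, j in combinations(range(d), 2):
        # A pair (i, j) is concordant if both permutations order it the same way.
        if (pi[i] < pi[j]) == (sigma[i] < sigma[j]):
            n_c += 1
        else:
            n_d += 1
    return (n_c - n_d) / comb(d, 2)

def mallows_kernel(pi, sigma, lam=1.0):
    """Mallows kernel: exp(-lam * n_d), with n_d = number of discordant pairs.

    lam is a length-scale hyperparameter (an illustrative default here).
    """
    d = len(pi)
    n_d = sum((pi[i] < pi[j]) != (sigma[i] < sigma[j])
              for i, j in combinations(range(d), 2))
    return exp(-lam * n_d)

# Identical permutations give the maximal kernel value of 1.0;
# fully reversed permutations give the minimal Kendall value of -1.0.
print(kendall_kernel((0, 1, 2), (0, 1, 2)))  # 1.0
print(kendall_kernel((0, 1, 2), (2, 1, 0)))  # -1.0
```

Both functions are positive-definite kernels on the symmetric group, which is what allows them to be plugged into a GP surrogate.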




How to Cite

Deshwal, A., Belakaria, S., Doppa, J. R., & Kim, D. H. (2022). Bayesian Optimization over Permutation Spaces. Proceedings of the AAAI Conference on Artificial Intelligence, 36(6), 6515-6523.



AAAI Technical Track on Machine Learning I