Function-on-Function Bayesian Optimization

Authors

  • Jingru Huang, Department of Industrial Engineering, Tsinghua University
  • Haijie Xu, Department of Industrial Engineering, Tsinghua University
  • Manrui Jiang, Department of Industrial Engineering, Tsinghua University
  • Chen Zhang, Department of Industrial Engineering, Tsinghua University

DOI:

https://doi.org/10.1609/aaai.v40i26.39353

Abstract

Bayesian optimization (BO) has been widely used to optimize expensive, gradient-free objective functions across various domains. However, existing BO methods have not addressed objectives where both the inputs and the outputs are functions, a setting that increasingly arises in complex systems equipped with advanced sensing technologies. To fill this gap, we propose a novel function-on-function Bayesian optimization (FFBO) framework. Specifically, we first introduce a function-on-function Gaussian process (FFGP) model with a separable operator-valued kernel to capture the correlations between function-valued inputs and outputs. In contrast to existing Gaussian process models, FFGP is modeled directly in the function space. Based on FFGP, we define a scalar upper confidence bound (UCB) acquisition function using a weighted operator-based scalarization strategy. A scalable functional gradient ascent (FGA) algorithm is then developed to efficiently identify the optimal function-valued input. We further analyze the theoretical properties of the proposed method. Extensive experiments on synthetic and real-world data demonstrate the superior performance of FFBO over existing approaches.
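To make the abstract's pipeline concrete, here is a minimal illustrative sketch of BO with a UCB acquisition over function-valued inputs. It is not the authors' FFBO implementation: the paper models the surrogate directly in function space with an operator-valued kernel, whereas this sketch discretizes each input function onto a small Fourier basis and fits an ordinary scalar-output GP on the coefficients, and it optimizes the acquisition by random multistart rather than the paper's functional gradient ascent. The objective and basis below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
T = np.linspace(0.0, 1.0, 50)  # common evaluation grid for input functions

def to_function(coef):
    # Map 3 basis coefficients to an input function x(t) (hypothetical basis).
    basis = np.stack([np.ones_like(T), np.sin(2 * np.pi * T), np.cos(2 * np.pi * T)])
    return coef @ basis

def objective(coef):
    # Hypothetical expensive black box: scalar summary of the output function,
    # maximized when x(t) matches sin(2*pi*t), i.e. coef near [0, 1, 0].
    x = to_function(coef)
    return -((x - np.sin(2 * np.pi * T)) ** 2).mean()

def rbf(A, B, ls=1.0):
    # Squared-exponential kernel on basis coefficients (a plain scalar kernel,
    # standing in for the paper's separable operator-valued kernel).
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * ls ** 2))

def gp_posterior(Xtr, ytr, Xte, noise=1e-6):
    # Standard GP regression posterior mean and variance.
    K = rbf(Xtr, Xtr) + noise * np.eye(len(Xtr))
    Ks = rbf(Xte, Xtr)
    mu = Ks @ np.linalg.solve(K, ytr)
    var = 1.0 - np.einsum("ij,ji->i", Ks, np.linalg.solve(K, Ks.T))
    return mu, np.maximum(var, 1e-12)

# BO loop: pick the candidate maximizing the UCB acquisition mu + beta * sigma.
X = rng.uniform(-2, 2, size=(5, 3))                 # initial design (coefficients)
y = np.array([objective(c) for c in X])
for _ in range(15):
    cand = rng.uniform(-2, 2, size=(256, 3))        # random multistart candidates
    mu, var = gp_posterior(X, y, cand)
    ucb = mu + 2.0 * np.sqrt(var)
    xnew = cand[np.argmax(ucb)]
    X = np.vstack([X, xnew])
    y = np.append(y, objective(xnew))

best = X[np.argmax(y)]                              # best coefficient vector found
```

The discretization step is the key simplification: once each input function is reduced to a finite coefficient vector, any standard GP/UCB machinery applies, at the cost of losing the function-space modeling the paper argues for.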

Published

2026-03-14

How to Cite

Huang, J., Xu, H., Jiang, M., & Zhang, C. (2026). Function-on-Function Bayesian Optimization. Proceedings of the AAAI Conference on Artificial Intelligence, 40(26), 21994–22002. https://doi.org/10.1609/aaai.v40i26.39353

Section

AAAI Technical Track on Machine Learning III