On Solution Functions of Optimization: Universal Approximation and Covering Number Bounds

Authors

  • Ming Jin, Virginia Tech
  • Vanshaj Khattar, Virginia Tech
  • Harshal Kaushik, Virginia Tech
  • Bilgehan Sel, Virginia Tech
  • Ruoxi Jia, Virginia Tech

DOI:

https://doi.org/10.1609/aaai.v37i7.25981

Keywords:

ML: Optimization, ML: Learning Theory, ML: Other Foundations of Machine Learning

Abstract

We study the expressibility and learnability of solution functions of convex optimization and their multi-layer architectural extension. The main results are: (1) the class of solution functions of linear programming (LP) and quadratic programming (QP) is a universal approximant for the smooth model class or some restricted Sobolev space, and we characterize the rate-distortion; (2) the approximation power is investigated from the viewpoint of regression error, where information about the target function is provided in terms of data observations; (3) compositionality in the form of a deep architecture with optimization as a layer is shown to reconstruct some basic functions used in numerical analysis without error, which implies that (4) a substantial reduction in rate-distortion can be achieved with a universal network architecture; and (5) we discuss statistical bounds on empirical covering numbers for LP/QP, as well as for a generic (possibly nonconvex) optimization problem, by exploiting tame geometry. Our results provide the first rigorous analysis of the approximation and learning-theoretic properties of solution functions, with implications for algorithmic design and performance guarantees.
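For intuition on the optimization-as-a-layer construction mentioned in the abstract, below is a minimal sketch (not taken from the paper) of a QP solution function: the map from problem data theta to the minimizer x*(theta), composed so that one layer's solution parameterizes the next. The particular QP (identity quadratic form, box constraints) and the use of cvxpy are illustrative assumptions only.

```python
# Minimal illustrative sketch: a QP "solution function" used as a layer.
# The problem instance (Q, box constraints) is an assumed example, not the paper's setup.
import numpy as np
import cvxpy as cp

def qp_solution_function(theta, Q):
    """Return x*(theta) = argmin_x 0.5 x^T Q x + theta^T x  s.t.  0 <= x <= 1."""
    n = theta.shape[0]
    x = cp.Variable(n)
    objective = cp.Minimize(0.5 * cp.quad_form(x, Q) + theta @ x)
    constraints = [x >= 0, x <= 1]
    cp.Problem(objective, constraints).solve()
    return x.value

# Two-"layer" composition: the output of one QP parameterizes the next one.
rng = np.random.default_rng(0)
n = 3
Q = np.eye(n)                        # positive definite, so each layer is a convex QP
theta = rng.standard_normal(n)
h = qp_solution_function(theta, Q)   # first optimization layer
y = qp_solution_function(h, Q)       # second layer consumes the previous solution
print(h, y)
```

Stacking such solution functions is the deep-architecture extension the paper analyzes; the sketch only shows the plumbing, not the approximation or covering-number results.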

Published

2023-06-26

How to Cite

Jin, M., Khattar, V., Kaushik, H., Sel, B., & Jia, R. (2023). On Solution Functions of Optimization: Universal Approximation and Covering Number Bounds. Proceedings of the AAAI Conference on Artificial Intelligence, 37(7), 8123-8131. https://doi.org/10.1609/aaai.v37i7.25981

Section

AAAI Technical Track on Machine Learning II