Blessing of Dimensionality for Approximating Sobolev Classes on Manifolds

Authors

  • Hong Ye Tan (University of California, Los Angeles; University of Cambridge)
  • Subhadip Mukherjee (Indian Institute of Technology Kharagpur)
  • Junqi Tang (University of Birmingham)
  • Carola-Bibiane Schönlieb (University of Cambridge)

DOI:

https://doi.org/10.1609/aaai.v40i30.39774

Abstract

The manifold hypothesis states that natural high-dimensional data lie on or near a low-dimensional manifold. The recent success of statistical and learning-based methods in very high dimensions empirically supports this hypothesis, suggesting that typical worst-case analysis does not provide practical guarantees. A natural step for analysis is thus to assume the manifold hypothesis and derive bounds that are independent of the ambient dimension in which the data are embedded. Theoretical implications in this direction have recently been explored in terms of generalization of ReLU networks and convergence of Langevin methods. In this work, we consider optimal uniform approximations with functions of finite statistical complexity. While upper bounds on uniform approximation using ReLU neural networks exist in the literature, we consider the opposite direction: lower bounds that quantify the fundamental difficulty of approximation on manifolds. In particular, we demonstrate that the statistical complexity required to approximate a class of bounded Sobolev functions on a compact manifold is bounded from below, and moreover that this bound depends only on intrinsic properties of the manifold, such as curvature, volume, and injectivity radius.

Published

2026-03-14

How to Cite

Tan, H. Y., Mukherjee, S., Tang, J., & Schönlieb, C.-B. (2026). Blessing of Dimensionality for Approximating Sobolev Classes on Manifolds. Proceedings of the AAAI Conference on Artificial Intelligence, 40(30), 25761–25769. https://doi.org/10.1609/aaai.v40i30.39774

Section

AAAI Technical Track on Machine Learning VII