The Role of Over-Parameterization in Machine Learning – the Good, the Bad, the Ugly

Authors

  • Fanghui Liu, University of Warwick

DOI:

https://doi.org/10.1609/aaai.v38i20.30290

Keywords:

Learning Theory, Over-parameterization, Neural Networks, Function Space

Abstract

The conventional wisdom of simple models in machine learning misses the bigger picture, especially for over-parameterized neural networks (NNs), where the number of parameters is much larger than the number of training data points. Our goal is to explore the mystery behind over-parameterized models from a theoretical perspective. In this talk, I will discuss the role of over-parameterization in neural networks to understand, theoretically, why they perform well. First, I will examine over-parameterization in neural networks from the perspective of models, to understand why they generalize well. Second, I will discuss the effects of over-parameterization on robustness and privacy. Third, I will consider over-parameterization from kernel methods to neural networks through the lens of function space theory. Finally, moving from classical statistical learning to sequential decision making, I will discuss how over-parameterization benefits deep reinforcement learning for function approximation. Potential future directions for the theory of over-parameterized ML will also be discussed.
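The defining condition above (parameters far outnumbering training points) can be illustrated with a minimal sketch. Here a random-features regression stands in for an over-parameterized NN (an assumption for illustration, not the talk's construction): with p features much larger than n samples, the minimum-norm least-squares solution interpolates the training data exactly.

```python
import numpy as np

# Minimal sketch (assumption: random-features regression as a stand-in for
# an over-parameterized NN). With p >> n, the feature matrix has full row
# rank, so the minimum-norm least-squares fit interpolates the data.
rng = np.random.default_rng(0)
n, d, p = 20, 5, 500            # n training points, input dim d, p >> n features

X = rng.normal(size=(n, d))
y = np.sin(X.sum(axis=1))       # a smooth target function

W = rng.normal(size=(d, p))     # random first layer, fixed (untrained)
Phi = np.tanh(X @ W)            # n x p feature matrix

theta = np.linalg.pinv(Phi) @ y           # minimum-norm interpolating solution
train_err = np.max(np.abs(Phi @ theta - y))
print(f"parameters: {p}, training points: {n}, max train error: {train_err:.2e}")
```

The training error is numerically zero despite p = 25n, which is the "benign overfitting" regime the talk sets out to explain theoretically.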

Published

2024-03-24

How to Cite

Liu, F. (2024). The Role of Over-Parameterization in Machine Learning – the Good, the Bad, the Ugly. Proceedings of the AAAI Conference on Artificial Intelligence, 38(20), 22674-22674. https://doi.org/10.1609/aaai.v38i20.30290