Explicitly Imposing Constraints in Deep Networks via Conditional Gradients Gives Improved Generalization and Faster Convergence

Authors

  • Sathya N. Ravi, University of Wisconsin-Madison
  • Tuan Dinh, University of Wisconsin-Madison
  • Vishnu Suresh Lokhande, University of Wisconsin-Madison
  • Vikas Singh, University of Wisconsin-Madison

DOI:

https://doi.org/10.1609/aaai.v33i01.33014772

Abstract

A number of recent results have demonstrated the benefits of incorporating various constraints when training deep architectures in vision and machine learning. The advantages range from guarantees of statistical generalization to better accuracy to model compression. But support for general constraints within widely used libraries remains scarce, and their broader deployment within the many applications that could benefit from them remains under-explored. Part of the reason is that stochastic gradient descent (SGD), the workhorse for training deep neural networks, does not natively handle constraints with global scope very well. In this paper, we revisit a classical first-order scheme from numerical optimization, Conditional Gradients (CG), which has thus far had limited applicability in training deep models. We show via rigorous analysis how various constraints can be naturally handled by modifications of this algorithm. We provide convergence guarantees and show a suite of immediate benefits: training ResNets with fewer layers but better accuracy simply by substituting in our version of CG, faster training of GANs with 50% fewer epochs in image inpainting applications, and provably better generalization guarantees using efficiently implementable forms of recently proposed regularizers.
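To make the core idea concrete, the sketch below shows a generic Frank-Wolfe (Conditional Gradient) update for a single parameter tensor constrained to an L1-ball. This is a minimal illustration, not the authors' released implementation: the L1-ball constraint, the radius `tau`, the classic 2/(t+2) step size, and the helper names `l1_ball_lmo` and `cg_step` are all illustrative assumptions.

```python
# Minimal sketch of one Conditional Gradient (Frank-Wolfe) step in PyTorch,
# assuming a single parameter tensor constrained to an L1-ball of radius tau.
# Not the authors' code; the constraint set and names are illustrative.

import torch

def l1_ball_lmo(grad: torch.Tensor, tau: float) -> torch.Tensor:
    """Linear minimization oracle: argmin_{||s||_1 <= tau} <s, grad>.

    Over the L1-ball the minimizer is a signed vertex: all mass on the
    coordinate where the gradient has the largest magnitude.
    """
    s = torch.zeros_like(grad)
    flat = grad.reshape(-1)
    idx = torch.argmax(flat.abs())              # coordinate of largest |gradient|
    s.view(-1)[idx] = -tau * torch.sign(flat[idx])
    return s

def cg_step(param: torch.Tensor, tau: float, step: int) -> None:
    """In-place CG update with the classic gamma_t = 2/(t+2) step size.

    Assumes param.grad has been populated by a preceding backward() call.
    """
    gamma = 2.0 / (step + 2.0)
    s = l1_ball_lmo(param.grad, tau)
    # Convex combination of the iterate and an extreme point of the
    # constraint set: the update stays feasible without any projection.
    param.data.mul_(1.0 - gamma).add_(gamma * s)

# Hypothetical usage inside a training loop (model, loss_fn, data assumed):
#   loss = loss_fn(model(x), y)
#   model.zero_grad(); loss.backward()
#   for p in model.parameters():
#       cg_step(p, tau=10.0, step=global_step)
```

Because each update is a convex combination of the current iterate and an extreme point of the constraint set, feasibility is maintained at every iteration without a projection step, which is what makes CG attractive for imposing constraints with global scope that SGD handles poorly.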

Published

2019-07-17

How to Cite

Ravi, S. N., Dinh, T., Lokhande, V. S., & Singh, V. (2019). Explicitly Imposing Constraints in Deep Networks via Conditional Gradients Gives Improved Generalization and Faster Convergence. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01), 4772-4779. https://doi.org/10.1609/aaai.v33i01.33014772

Section

AAAI Technical Track: Machine Learning