CR-SAM: Curvature Regularized Sharpness-Aware Minimization
DOI: https://doi.org/10.1609/aaai.v38i6.28431
Keywords: CV: Learning & Optimization for CV, ML: Optimization
Abstract
The capacity to generalize to future unseen data stands as one of the most crucial attributes of deep neural networks. Sharpness-Aware Minimization (SAM) aims to enhance generalizability by minimizing the worst-case loss, using one-step gradient ascent as an approximation. However, as training progresses, the loss landscape becomes increasingly non-linear, rendering one-step gradient ascent less effective, while multi-step gradient ascent incurs higher training cost. In this paper, we introduce a normalized Hessian trace to accurately measure the curvature of the loss landscape on both training and test sets. In particular, to counter excessive non-linearity of the loss landscape, we propose Curvature Regularized SAM (CR-SAM), which integrates the normalized Hessian trace into SAM as a regularizer. Additionally, we present an efficient way to compute the trace via finite differences with parallelism. Our theoretical analysis, based on PAC-Bayes bounds, establishes the regularizer's efficacy in reducing generalization error. Empirical evaluation on CIFAR and ImageNet datasets shows that CR-SAM consistently enhances classification performance for ResNet and Vision Transformer (ViT) models across various datasets. Our code is available at https://github.com/TrustAIoT/CR-SAM.
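The abstract mentions computing the Hessian trace via finite differences. As a rough illustration of that idea (not the paper's implementation — the function and parameter names here, such as `grad_fn`, `num_samples`, and `eps`, are illustrative, and the paper's normalization of the trace is omitted), one can combine Hutchinson's randomized trace estimator with central finite differences of the gradient, avoiding second-order autodiff:

```python
import numpy as np

def hutchinson_trace_fd(grad_fn, w, num_samples=50, eps=1e-3, seed=0):
    """Estimate tr(H) of a loss at parameters w.

    Uses Hutchinson's estimator tr(H) ~ E[v^T H v] with Rademacher
    probe vectors v, and approximates each Hessian-vector product by a
    central finite difference of the gradient:
        H v ~ (grad(w + eps*v) - grad(w - eps*v)) / (2*eps).
    """
    rng = np.random.default_rng(seed)
    est = 0.0
    for _ in range(num_samples):
        v = rng.choice([-1.0, 1.0], size=w.shape)  # Rademacher probe
        hvp = (grad_fn(w + eps * v) - grad_fn(w - eps * v)) / (2 * eps)
        est += v @ hvp
    return est / num_samples

# Sanity check on a quadratic loss L(w) = 0.5 * w^T A w, whose Hessian is A,
# so the true trace is tr(A) = 6.0.
A = np.diag([1.0, 2.0, 3.0])
grad_fn = lambda w: A @ w
print(hutchinson_trace_fd(grad_fn, np.ones(3)))  # close to 6.0
```

The probe draws are independent, so in a real training loop the `num_samples` gradient-difference evaluations could be batched in parallel, which is presumably the kind of parallelism the abstract alludes to.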
Published: 2024-03-24
How to Cite
Wu, T., Luo, T., & Wunsch II, D. C. (2024). CR-SAM: Curvature Regularized Sharpness-Aware Minimization. Proceedings of the AAAI Conference on Artificial Intelligence, 38(6), 6144-6152. https://doi.org/10.1609/aaai.v38i6.28431
Section: AAAI Technical Track on Computer Vision V