Efficient Dynamic Batch Adaptation (Student Abstract)
DOI:
https://doi.org/10.1609/aaai.v37i13.27024
Keywords:
Online Batch Selection, Few Shot Learning, Optimization
Abstract
In this paper we introduce Efficient Dynamic Batch Adaptation (EDBA), which improves on Dynamic Batch Adaptation, a previous method that adjusts the composition and size of the current batch. Our improvements allow Dynamic Batch Adaptation to scale feasibly to larger models and datasets, drastically improving model convergence and generalization. We show that the method still performs especially well in data-scarce scenarios, reaching a test accuracy of 90.68% when trained on only 100 samples of CIFAR-10, while the baseline reaches only 23.79%. On the full CIFAR-10 dataset, EDBA converges in ∼120 epochs while the baseline requires ∼300 epochs.
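The abstract only sketches the idea of adapting the batch during training, so the following is a minimal, hypothetical illustration of a generic dynamic-batch-size training loop, not the EDBA algorithm itself. The growth rule (double the batch size when the training loss plateaus), the thresholds, and the make_loader helper are all assumptions made for illustration.

```python
# Illustrative sketch only: a generic dynamic-batch-size training loop.
# This is NOT the EDBA algorithm from the paper; the growth rule,
# thresholds, and the make_loader helper are hypothetical.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset


def make_loader(dataset, batch_size):
    # Hypothetical helper: rebuild the loader whenever the batch size changes.
    return DataLoader(dataset, batch_size=batch_size, shuffle=True)


def train_with_dynamic_batches(model, dataset, epochs=10,
                               init_bs=16, max_bs=256, grow_factor=2,
                               plateau_patience=2, lr=1e-3):
    loss_fn = nn.CrossEntropyLoss()
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    batch_size, best_loss, stale = init_bs, float("inf"), 0

    for epoch in range(epochs):
        loader = make_loader(dataset, batch_size)
        epoch_loss = 0.0
        for x, y in loader:
            opt.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            opt.step()
            epoch_loss += loss.item() * x.size(0)
        epoch_loss /= len(dataset)

        # Assumed adaptation rule: grow the batch when training loss plateaus.
        if epoch_loss < best_loss - 1e-4:
            best_loss, stale = epoch_loss, 0
        else:
            stale += 1
            if stale >= plateau_patience and batch_size < max_bs:
                batch_size = min(batch_size * grow_factor, max_bs)
                stale = 0
        print(f"epoch {epoch}: loss={epoch_loss:.4f}, batch_size={batch_size}")


if __name__ == "__main__":
    # Tiny synthetic example so the sketch runs end to end.
    X = torch.randn(512, 20)
    y = torch.randint(0, 10, (512,))
    model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 10))
    train_with_dynamic_batches(model, TensorDataset(X, y), epochs=5)
```

EDBA additionally adapts the composition of each batch, which this sketch does not attempt to reproduce; consult the paper for the actual selection criteria.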
Published
2023-09-06
How to Cite
Simionescu, C., & Stoica, G. (2023). Efficient Dynamic Batch Adaptation (Student Abstract). Proceedings of the AAAI Conference on Artificial Intelligence, 37(13), 16328-16329. https://doi.org/10.1609/aaai.v37i13.27024
Issue
Section
AAAI Student Abstract and Poster Program