A Coreset Learning Reality Check
DOI:
https://doi.org/10.1609/aaai.v37i7.26074
Keywords:
ML: Dimensionality Reduction/Feature Selection, ML: Classification and Regression, ML: Scalability of ML Systems
Abstract
Subsampling algorithms are a natural approach to reducing data size before fitting models on massive datasets. In recent years, several works have proposed methods for subsampling rows from a data matrix while preserving the information relevant for classification. Although these works are supported by theory and limited experiments, to date there has been no comprehensive evaluation of these methods. In our work, we directly compare multiple subsampling methods for logistic regression, drawn from the coreset and optimal subsampling literature, and find inconsistencies in their effectiveness. In many cases, these methods do not outperform simple uniform subsampling.
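A minimal sketch of the uniform-subsampling baseline the abstract refers to: fit logistic regression on rows drawn uniformly at random and evaluate on held-out data. This is illustrative only and not the paper's evaluation code; the synthetic dataset, subsample size m, and solver settings are assumptions.

# Illustrative sketch (assumes scikit-learn and NumPy); not the paper's experimental setup.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical synthetic data standing in for a massive data matrix.
X, y = make_classification(n_samples=100_000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Uniform subsampling: draw m rows uniformly at random without replacement.
m = 1_000
idx = rng.choice(len(X_train), size=m, replace=False)

# Fit logistic regression on the subsample and score on the full held-out test set.
model = LogisticRegression(max_iter=1000).fit(X_train[idx], y_train[idx])
print("accuracy on full test set:", model.score(X_test, y_test))

Any coreset or optimal-subsampling method would replace only the row-selection step above (the choice of idx, possibly with importance weights); the downstream fit and evaluation stay the same, which is what makes this a natural baseline for comparison.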
Published
2023-06-26
How to Cite
Lu, F., Raff, E., & Holt, J. (2023). A Coreset Learning Reality Check. Proceedings of the AAAI Conference on Artificial Intelligence, 37(7), 8940-8948. https://doi.org/10.1609/aaai.v37i7.26074
Issue
Vol. 37 No. 7 (2023)
Section
AAAI Technical Track on Machine Learning II