Precision-based Boosting
DOI: https://doi.org/10.1609/aaai.v35i10.17105
Keywords: Ensemble Methods, Classification and Regression
Abstract
AdaBoost is a highly popular ensemble classification method for which many variants have been published. This paper proposes a generic refinement of all of these AdaBoost variants. Instead of assigning weights based on the total error of the base classifiers (as in AdaBoost), our method uses class-specific error rates. On instance x, it assigns a higher weight to a classifier predicting label y on x if that classifier is less likely to make a mistake when it predicts class y. Like AdaBoost, our method is guaranteed to boost weak learners into strong learners. An empirical study on AdaBoost and one of its multi-class versions, SAMME, demonstrates the superiority of our method on datasets with more than 1,000 instances as well as on datasets with more than three classes.
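To make the contrast concrete, the following is a minimal, hypothetical sketch (not the paper's actual algorithm) of the core idea: AdaBoost derives one weight per classifier from its total error, while a precision-based scheme derives one weight per predicted class from that class's error rate. All function names and the toy data are illustrative assumptions.

```python
import math

def total_error_weight(y_true, y_pred):
    """AdaBoost-style: a single weight from the classifier's overall error rate."""
    err = sum(t != p for t, p in zip(y_true, y_pred)) / len(y_true)
    err = min(max(err, 1e-10), 1 - 1e-10)  # avoid log(0) / division by zero
    return 0.5 * math.log((1 - err) / err)

def class_specific_weights(y_true, y_pred, labels):
    """Precision-based idea: one weight per predicted class, derived from the
    error rate (1 - precision) among instances the classifier assigns to it."""
    weights = {}
    for c in labels:
        predicted_c = [(t, p) for t, p in zip(y_true, y_pred) if p == c]
        if not predicted_c:
            weights[c] = 0.0  # classifier never predicts c on this sample
            continue
        err_c = sum(t != p for t, p in predicted_c) / len(predicted_c)
        err_c = min(max(err_c, 1e-10), 1 - 1e-10)
        weights[c] = 0.5 * math.log((1 - err_c) / err_c)
    return weights

# Toy example: the classifier is more precise when it predicts class 0
# (1 mistake out of 4 such predictions) than class 1 (2 out of 6).
y_true = [0, 0, 0, 1, 1, 1, 1, 0, 1, 0]
y_pred = [0, 0, 1, 1, 1, 1, 0, 0, 1, 1]
print(total_error_weight(y_true, y_pred))                    # one global weight
print(class_specific_weights(y_true, y_pred, labels=[0, 1])) # class 0 weighted higher
```

Under this sketch, a vote for class 0 by this classifier carries more weight than a vote for class 1, matching the abstract's intuition that a prediction should count for more when the classifier rarely errs on the class it predicts.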
Published: 2021-05-18
How to Cite
Nikravan, M. H., Movahedan, M., & Zilles, S. (2021). Precision-based Boosting. Proceedings of the AAAI Conference on Artificial Intelligence, 35(10), 9153-9160. https://doi.org/10.1609/aaai.v35i10.17105
Section: AAAI Technical Track on Machine Learning III