Adaptive and Asymptotic Mean-based Subclass Discriminant Analysis
DOI:
https://doi.org/10.1609/aaai.v40i25.39253
Abstract
Traditional discriminant analysis (DA) is a classical supervised learning algorithm for reducing the dimensionality of data under a Gaussian assumption. Because the single class mean in traditional DA cannot adequately capture non-Gaussian data distributions, some existing DA algorithms based on clustering criteria learn multiple means per class to address the non-Gaussian issue. However, clustering-based DA inevitably involves a constrained optimization problem for learning the multiple means, which may lead to locally optimal solutions. To address these issues, inspired by smooth approximation theory and the concept of the Kolmogorov mean, this paper explores an unconstrained function with an asymptotic property as an alternative proxy for clustering-based DA algorithms. The derived algorithm, adaptive and asymptotic mean-based subclass discriminant analysis (AASDA), not only leverages multiple means to represent different subclasses within the same class but also adaptively and asymptotically learns a suitable mean for each sample in the learned optimal subspace via a gradient-based optimizer. An asymptotic analysis of the unconstrained function, together with a gradient analysis and a convergence guarantee for the proposed criterion, verifies the effectiveness of the AASDA algorithm. Its merits are thoroughly assessed on a suite of synthetic and real-world data experiments.
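As background for the Kolmogorov-mean idea the abstract invokes, the sketch below illustrates how a quasi-arithmetic (Kolmogorov) mean with an exponential generator yields a smooth, unconstrained approximation to a hard minimum, the kind of differentiable proxy that can replace a discrete "nearest subclass mean" assignment. This is a generic illustration of the concept, not the paper's actual AASDA criterion; the generator choice and the `beta` parameter are assumptions for exposition.

```python
import numpy as np

def kolmogorov_mean(x, f, f_inv):
    """Quasi-arithmetic (Kolmogorov) mean: f^{-1}( average of f(x_i) )."""
    return f_inv(np.mean(f(np.asarray(x, dtype=float))))

def soft_min(x, beta):
    """Smooth minimum via the Kolmogorov mean with generator f(t) = exp(-beta * t).

    As beta -> 0 this recovers the arithmetic mean; as beta -> infinity it
    converges to min(x), giving a differentiable surrogate for a hard
    nearest-subclass-mean assignment (illustrative only, not AASDA itself).
    """
    x = np.asarray(x, dtype=float)
    return kolmogorov_mean(x,
                           lambda t: np.exp(-beta * t),
                           lambda s: -np.log(s) / beta)

# Hypothetical squared distances from one sample to three subclass means.
distances = [4.0, 1.0, 3.0]
print(soft_min(distances, beta=0.01))   # close to the arithmetic mean (~2.67)
print(soft_min(distances, beta=100.0))  # close to min(distances) = 1.0
```

Because `soft_min` is smooth in both the distances and `beta`, a gradient-based optimizer can descend through it, which is the asymptotic-approximation flavor the abstract describes.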
Published
2026-03-14
How to Cite
Feng, Y., Gao, Y., & Nie, F. (2026). Adaptive and Asymptotic Mean-based Subclass Discriminant Analysis. Proceedings of the AAAI Conference on Artificial Intelligence, 40(25), 21101–21110. https://doi.org/10.1609/aaai.v40i25.39253
Section
AAAI Technical Track on Machine Learning II