On the Equivalence of Linear Discriminant Analysis and Least Squares

Authors

  • Kibok Lee, Samsung Electronics
  • Junmo Kim, KAIST

DOI:

https://doi.org/10.1609/aaai.v29i1.9544

Keywords:

linear discriminant analysis, least squares, dimensionality reduction

Abstract

Linear discriminant analysis (LDA) is a popular dimensionality reduction and classification method that simultaneously maximizes between-class scatter and minimizes within-class scatter. In this paper, we verify the equivalence of LDA and least squares (LS) formulated with a particular set of dependent variable matrices. The equivalence is in the sense that the LDA solution matrix and the LS solution matrix have the same range. The resulting LS formulation admits an intuitive interpretation: its solution clusters the data according to class labels. Further, because the LDA and LS solutions share the same range, we can design a two-stage algorithm that computes the LDA solution, ordinarily obtained by generalized eigenvalue decomposition (GEVD), much faster than solving the original GEVD directly. Experimental results demonstrate the equivalence of the LDA solution and the proposed LS solution.
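The two claims in the abstract (the shared range, and a faster two-stage route to the GEVD solution) can be illustrated with a small numerical sketch. The snippet below is not the authors' algorithm: the plain class-indicator target and the synthetic Gaussian data are illustrative assumptions, and the paper defines its own set of dependent variable matrices.

```python
# Minimal sketch, assuming a plain class-indicator least-squares target
# and synthetic Gaussian data; not the construction used in the paper.
import numpy as np
from scipy.linalg import eigh, subspace_angles

rng = np.random.default_rng(0)
n, d, c = 300, 10, 3                              # samples, features, classes
y = rng.integers(0, c, size=n)                    # class labels
means = 3.0 * rng.standard_normal((c, d))         # one mean vector per class
X = means[y] + rng.standard_normal((n, d))

# Within-class and between-class scatter matrices.
m = X.mean(axis=0)
Sw = np.zeros((d, d))
Sb = np.zeros((d, d))
for k in range(c):
    Xk = X[y == k]
    mk = Xk.mean(axis=0)
    Sw += (Xk - mk).T @ (Xk - mk)
    Sb += len(Xk) * np.outer(mk - m, mk - m)

# LDA solution from the generalized eigenvalue problem Sb w = lambda Sw w,
# keeping the c-1 directions with the largest eigenvalues.
evals, evecs = eigh(Sb, Sw)
W_lda = evecs[:, ::-1][:, : c - 1]

# Least-squares solution with a (hypothetical) class-indicator target.
# The last indicator column is dropped: after centering X it is linearly
# dependent on the others, so the column space of the solution is unchanged.
Xc = X - m
Y = np.zeros((n, c))
Y[np.arange(n), y] = 1.0
W_ls, *_ = np.linalg.lstsq(Xc, Y[:, : c - 1], rcond=None)

# Claim 1: same range. All principal angles between the two column spaces
# should be numerically zero.
print("principal angles (deg):", np.rad2deg(subspace_angles(W_lda, W_ls)))

# Claim 2 (two-stage idea): project the scatter matrices onto range(W_ls) and
# solve a small (c-1) x (c-1) GEVD; mapping back recovers the LDA directions
# without a full d x d generalized eigendecomposition.
mu, V = eigh(W_ls.T @ Sb @ W_ls, W_ls.T @ Sw @ W_ls)
W_fast = W_ls @ V[:, ::-1]
cos = [abs(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))
       for a, b in zip(W_lda.T, W_fast.T)]
print("per-direction |cos| vs GEVD:", np.round(cos, 6))   # each close to 1.0
```

In this sketch, stage one is an ordinary least-squares solve and stage two is a (c-1) x (c-1) eigenproblem, which is where the speedup over the full d x d GEVD comes from when d is large.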

Published

2015-02-21

How to Cite

Lee, K., & Kim, J. (2015). On the Equivalence of Linear Discriminant Analysis and Least Squares. Proceedings of the AAAI Conference on Artificial Intelligence, 29(1). https://doi.org/10.1609/aaai.v29i1.9544

Issue

Vol. 29 No. 1 (2015)

Section

Main Track: Novel Machine Learning Algorithms