Efficiently Learning a Distance Metric for Large Margin Nearest Neighbor Classification

Authors

  • Kyoungup Park, The Australian National University and NICTA
  • Chunhua Shen, University of Adelaide and NICTA
  • Zhihui Hao, Beijing Institute of Technology
  • Junae Kim, The Australian National University and NICTA

Abstract

We consider the problem of learning a Mahalanobis distance metric to improve nearest neighbor classification. Our work builds upon the large margin nearest neighbor (LMNN) classification framework. Due to the positive semidefiniteness constraint in the LMNN optimization problem, it does not scale well with the dimensionality of the input data. The original LMNN solver partially alleviates this problem by adopting alternating projection methods instead of standard interior-point methods; still, the computational complexity of each iteration is at least O(D^3), where D is the dimension of the input data. In this work, we propose a column generation based algorithm that solves the LMNN optimization problem much more efficiently. Our algorithm is much more scalable in that, at each iteration, it does not require a full eigen-decomposition; instead, we only need to find the leading eigenvalue and its corresponding eigenvector, which has O(D^2) complexity. Experiments demonstrate the efficiency and efficacy of our algorithm.
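The abstract's efficiency claim hinges on needing only the leading eigenvalue and eigenvector of a D x D matrix per iteration, which costs O(D^2) per matrix-vector product rather than the O(D^3) of a full eigen-decomposition. A standard way to compute the leading eigenpair at that cost is power iteration; the sketch below is an illustration of that generic technique, not the authors' implementation (the function name and tolerances are hypothetical):

```python
import numpy as np

def leading_eigenpair(A, num_iters=1000, tol=1e-10):
    """Power iteration for the leading eigenpair of a symmetric PSD matrix A.

    Each iteration performs one matrix-vector product, i.e. O(D^2) work for
    a D x D matrix, versus O(D^3) for a full eigen-decomposition.
    """
    D = A.shape[0]
    v = np.ones(D) / np.sqrt(D)  # arbitrary nonzero starting vector
    lam = 0.0
    for _ in range(num_iters):
        w = A @ v                # O(D^2) matrix-vector product
        lam_new = v @ w          # Rayleigh quotient estimate of the eigenvalue
        v = w / np.linalg.norm(w)
        if abs(lam_new - lam) < tol:
            break
        lam = lam_new
    return lam, v

# Example: eigenvalues of [[2, 1], [1, 2]] are 3 and 1.
A = np.array([[2.0, 1.0], [1.0, 2.0]])
lam, v = leading_eigenpair(A)
```

In practice, Lanczos-type methods (e.g. `scipy.sparse.linalg.eigsh` with `k=1`) converge faster than plain power iteration while keeping the same per-iteration O(D^2) cost.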

Published

2011-08-04

How to Cite

Park, K., Shen, C., Hao, Z., & Kim, J. (2011). Efficiently Learning a Distance Metric for Large Margin Nearest Neighbor Classification. Proceedings of the AAAI Conference on Artificial Intelligence, 25(1), 453-458. Retrieved from https://ojs.aaai.org/index.php/AAAI/article/view/7904

Section

AAAI Technical Track: Machine Learning