Fixed-Rank Supervised Metric Learning on Riemannian Manifold

Authors

  • Yadong Mu (AT&T Labs)

DOI:

https://doi.org/10.1609/aaai.v30i1.10246

Abstract

Metric learning has become a critical tool in many machine learning tasks. This paper focuses on learning an optimal Mahalanobis distance matrix (parameterized by a positive semi-definite matrix W) in the supervised setting. Recently, low-rank metric learning, which requires that the matrix W be dominated by a few large singular values, has attracted particular research attention. In the era of high feature dimensions, low-rank metric learning effectively reduces storage and computation overheads. However, existing low-rank metric learning algorithms usually adopt sophisticated regularization (such as the LogDet divergence) to encourage low-rankness, which unfortunately incurs iterative computations of the matrix SVD. In this paper, we tackle low-rank metric learning by enforcing a fixed-rank constraint on the matrix W. We harness the Riemannian geometry of the manifold of fixed-rank matrices and devise a novel second-order Riemannian retraction operator. The proposed operator is efficient and ensures that W always resides on the manifold. Comprehensive numerical experiments on benchmarks clearly suggest that the proposed algorithm is substantially superior to, or on par with, the state-of-the-art in terms of k-NN classification accuracy. Moreover, the proposed manifold retraction operator can also be naturally applied in generic rank-constrained machine learning algorithms.
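To make the setup concrete, below is a minimal sketch of fixed-rank supervised metric learning: gradient steps on W followed by a retraction back onto the set of rank-r positive semi-definite matrices via truncated eigendecomposition. This simple projection-style retraction and the toy pairwise hinge loss are illustrative assumptions, not the paper's second-order retraction operator, which is designed to avoid exactly this kind of repeated eigen/SVD computation.

```python
import numpy as np

def retract_to_fixed_rank_psd(W, r):
    """Map a symmetric matrix onto the rank-r PSD matrices by keeping
    the r largest nonnegative eigenvalues (a simple retraction; the
    paper devises a cheaper second-order alternative)."""
    W = 0.5 * (W + W.T)                        # symmetrize
    vals, vecs = np.linalg.eigh(W)             # eigenvalues ascending
    vals = np.clip(vals, 0.0, None)            # enforce PSD
    idx = np.argsort(vals)[::-1][:r]           # top-r eigenpairs
    return (vecs[:, idx] * vals[idx]) @ vecs[:, idx].T

def loss_and_grad(W, X, labels, margin=1.0):
    """Toy supervised objective: pull same-class pairs together and
    push different-class pairs apart with a hinge margin, as in many
    pairwise metric-learning formulations."""
    n, d = X.shape
    loss, grad = 0.0, np.zeros((d, d))
    for i in range(n):
        for j in range(i + 1, n):
            diff = X[i] - X[j]
            dist = diff @ W @ diff             # squared Mahalanobis distance
            if labels[i] == labels[j]:
                loss += dist
                grad += np.outer(diff, diff)
            elif dist < margin:                # violated margin
                loss += margin - dist
                grad -= np.outer(diff, diff)
    return loss, grad

# Illustrative usage on synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 10))
labels = rng.integers(0, 2, size=40)
r = 3
W = retract_to_fixed_rank_psd(np.eye(10), r)   # rank-r starting point
for step in range(50):
    _, G = loss_and_grad(W, X, labels)
    W = retract_to_fixed_rank_psd(W - 0.01 * G, r)
print("rank(W) =", np.linalg.matrix_rank(W))   # stays at r
```

The retraction is what keeps every iterate on the fixed-rank manifold; the paper's contribution is an efficient second-order operator that plays this role without the per-iteration SVD-style cost incurred above.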

Published

2016-02-21

How to Cite

Mu, Y. (2016). Fixed-Rank Supervised Metric Learning on Riemannian Manifold. Proceedings of the AAAI Conference on Artificial Intelligence, 30(1). https://doi.org/10.1609/aaai.v30i1.10246

Section

Technical Papers: Machine Learning Methods