TY - JOUR
AU - Liu, Wei
AU - Mu, Cun
AU - Ji, Rongrong
AU - Ma, Shiqian
AU - Smith, John
AU - Chang, Shih-Fu
PY - 2015/02/21
Y2 - 2022/08/19
TI - Low-Rank Similarity Metric Learning in High Dimensions
JF - Proceedings of the AAAI Conference on Artificial Intelligence
JA - AAAI
VL - 29
IS - 1
SE - Main Track: Novel Machine Learning Algorithms
DO - 10.1609/aaai.v29i1.9639
UR - https://ojs.aaai.org/index.php/AAAI/article/view/9639
AB - Metric learning has become a widely used tool in machine learning. To reduce the expensive storage and computational costs incurred by increasing dimensionality, low-rank metric learning has arisen, as it can be more economical in both storage and computation. However, existing low-rank metric learning algorithms usually adopt nonconvex objectives and are hence sensitive to the choice of a heuristic low-rank basis. In this paper, we propose a novel low-rank metric learning algorithm that yields bilinear similarity functions. The algorithm scales linearly with the input dimensionality in both space and time, and is therefore applicable to high-dimensional data domains. A convex objective free of heuristics is formulated by leveraging trace-norm regularization to promote low-rankness. Crucially, we prove that all globally optimal metric solutions must retain a certain low-rank structure, which enables our algorithm to decompose the high-dimensional learning task into two steps: an SVD-based projection and a metric learning problem of reduced dimensionality. The latter step can be tackled efficiently by employing a linearized Alternating Direction Method of Multipliers (ADMM). The efficacy of the proposed algorithm is demonstrated through experiments on four benchmark datasets with tens of thousands of dimensions.
ER -