Less but Better: Generalization Enhancement of Ordinal Embedding via Distributional Margin


  • Ke Ma Chinese Academy of Sciences
  • Qianqian Xu Chinese Academy of Sciences
  • Zhiyong Yang Chinese Academy of Sciences
  • Xiaochun Cao Chinese Academy of Sciences




In the absence of prior knowledge, ordinal embedding methods obtain new representations for items in a low-dimensional Euclidean space via a set of quadruple-wise comparisons. These ordinal comparisons often come from human annotators, and a sufficient number of comparisons is what makes classical approaches succeed. However, collecting a large amount of labeled data is known to be hard, and most existing work pays little attention to generalization ability when samples are insufficient. Meanwhile, recent progress in large margin theory discloses that, rather than just maximizing the minimum margin, the margin mean and variance, which characterize the margin distribution, are more crucial to overall generalization performance. To address the issue of insufficient training samples, we propose a margin distribution learning paradigm for ordinal embedding, entitled Distributional Margin based Ordinal Embedding (DMOE). Precisely, we first define the margin for the ordinal embedding problem. Second, we formulate a concise objective function which avoids maximizing the margin mean and minimizing the margin variance directly but exhibits a similar effect. Moreover, an Augmented Lagrange Multiplier based algorithm is customized to seek the optimal solution of DMOE effectively. Experimental studies on both simulated and real-world datasets are provided to show the effectiveness of the proposed algorithm.
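As a minimal illustration of the quantities the abstract refers to, the sketch below computes quadruple-wise margins in a toy embedding and then their mean and variance. The margin definition used here (squared-distance gap of each quadruple) and the function name `quadruple_margins` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def quadruple_margins(X, quads):
    """For each quadruple (i, j, k, l) asserting that items i and j are
    closer than items k and l, use the gap d(k,l)^2 - d(i,j)^2 as a
    margin (positive means the embedding satisfies the comparison).
    Hypothetical definition for illustration only."""
    margins = []
    for i, j, k, l in quads:
        d_ij = np.sum((X[i] - X[j]) ** 2)  # squared distance of the "close" pair
        d_kl = np.sum((X[k] - X[l]) ** 2)  # squared distance of the "far" pair
        margins.append(d_kl - d_ij)
    return np.array(margins)

# Toy 2-D embedding of four items.
X = np.array([[0.0, 0.0], [0.1, 0.0], [1.0, 0.0], [0.0, 2.0]])
quads = [(0, 1, 2, 3), (0, 1, 0, 2)]  # "items 0,1 are closer than items k,l"
m = quadruple_margins(X, quads)
# The margin distribution is summarized by its mean and variance,
# the two statistics the DMOE objective targets.
print(m.mean(), m.var())
```

In this toy case both quadruples are satisfied (both margins are positive); a margin-distribution objective would push the mean of these margins up while keeping their spread small, rather than focusing only on the single smallest margin.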




How to Cite

Ma, K., Xu, Q., Yang, Z., & Cao, X. (2019). Less but Better: Generalization Enhancement of Ordinal Embedding via Distributional Margin. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01), 2978-2985. https://doi.org/10.1609/aaai.v33i01.33012978



AAAI Technical Track: Knowledge Representation and Reasoning