Metric Learning from Relative Comparisons by Minimizing Squared Residual


Abstract

Recent studies [1]-[5] have suggested using constraints in the form of relative distance comparisons to represent domain knowledge: d(a, b) < d(c, d), where d(·) is the distance function and a, b, c, d are data objects. Such constraints are readily available in many problems where pairwise constraints are difficult to obtain. In this paper we consider the problem of learning a Mahalanobis distance metric from supervision in the form of relative distance comparisons. We propose a simple yet effective algorithm that minimizes a convex objective function corresponding to the sum of squared residuals of the constraints. We also extend our model and algorithm to promote sparsity in the learned metric matrix. Experimental results suggest that our method consistently outperforms existing methods in terms of clustering accuracy. Furthermore, the sparsity extension leads to more stable estimation when the dimension is high and only a small amount of supervision is given.
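The idea the abstract describes can be sketched as projected gradient descent on the sum of squared residuals of violated quadruplet constraints d(a, b) < d(c, d), with the metric matrix M kept positive semidefinite. This is a simplified illustration under assumed details, not the authors' exact algorithm (the paper's formulation, regularization, and sparsity extension differ); the function names, step size, and toy data below are made up for the example.

```python
import numpy as np

def mahalanobis_sq(M, x, y):
    """Squared Mahalanobis distance d_M(x, y) = (x - y)^T M (x - y)."""
    diff = x - y
    return diff @ M @ diff

def fit_metric(X, constraints, n_iter=500, lr=1e-3):
    """Learn a PSD matrix M from quadruplet constraints (a, b, c, d),
    each meaning d_M(x_a, x_b) < d_M(x_c, x_d), by projected gradient
    descent on the sum of squared residuals of violated constraints.
    (Illustrative sketch only, not the paper's exact objective.)"""
    _, p = X.shape
    M = np.eye(p)
    for _ in range(n_iter):
        grad = np.zeros((p, p))
        for a, b, c, d in constraints:
            u, v = X[a] - X[b], X[c] - X[d]
            r = u @ M @ u - v @ M @ v   # residual; r > 0 means violated
            if r > 0:                   # squared-hinge residual r^2
                grad += 2.0 * r * (np.outer(u, u) - np.outer(v, v))
        M -= lr * grad
        # project back onto the PSD cone via eigendecomposition
        w, V = np.linalg.eigh(M)
        M = (V * np.clip(w, 0.0, None)) @ V.T
    return M

# Toy demo: two pairs of points that are close along feature 0 but far
# apart along the irrelevant feature 1; the constraints say each
# within-pair distance should be smaller than a between-pair distance.
X = np.array([[0.0, 0.0], [0.1, 3.0], [1.0, 0.0], [1.1, 3.0]])
C = [(0, 1, 0, 2), (2, 3, 0, 2)]
M = fit_metric(X, C)
```

After fitting, the learned M downweights the irrelevant second feature, so the within-pair distances shrink relative to the between-pair distance even though the Euclidean metric badly violates both constraints.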

Meeting Name

12th IEEE International Conference on Data Mining, ICDM (2012: Dec. 10-13, Brussels, Belgium)


Computer Science

Keywords and Phrases

Clustering Accuracy; Convex Objectives; Data Objects; Distance Functions; Domain Knowledge; Mahalanobis Distances; Mahalanobis Metric; Metric Learning; Metric Matrix; Model And Algorithms; Pairwise Constraints; Relative Comparisons; Relative Distances

Document Type

Article - Conference proceedings

© 2012 Institute of Electrical and Electronics Engineers (IEEE), All rights reserved.

Publication Date

01 Dec 2012