Published December 2019 | Submitted + Published + Supplemental Material
Book Section - Chapter Open

Landmark Ordinal Embedding

Abstract

In this paper, we aim to learn a low-dimensional Euclidean representation from a set of constraints of the form "item j is closer to item i than item k". Existing approaches for this "ordinal embedding" problem require expensive optimization procedures, which cannot scale to handle increasingly larger datasets. To address this issue, we propose a landmark-based strategy, which we call Landmark Ordinal Embedding (LOE). Our approach trades off statistical efficiency for computational efficiency by exploiting the low-dimensionality of the latent embedding. We derive bounds establishing the statistical consistency of LOE under the popular Bradley-Terry-Luce noise model. Through a rigorous analysis of the computational complexity, we show that LOE is significantly more efficient than conventional ordinal embedding approaches as the number of items grows. We validate these characterizations empirically on both synthetic and real datasets. We also present a practical approach that achieves the "best of both worlds", by using LOE to warm-start existing methods that are more statistically efficient but computationally expensive.
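For context, a sketch of the standard setup (notation assumed here, not taken verbatim from the paper): each item i is associated with a latent point x_i in R^d, and a constraint "item j is closer to item i than item k" corresponds to the inequality ||x_i - x_j|| < ||x_i - x_k||. Under a Bradley-Terry-Luce-style noise model, a comparison is observed with probability given by a logistic function of the gap between the two distances, for example

P("j closer to i than k") = exp(-||x_i - x_j||^2) / ( exp(-||x_i - x_j||^2) + exp(-||x_i - x_k||^2) ),

so comparisons between nearly equidistant items are the noisiest, while clearly separated triplets are answered correctly with high probability. The exact parameterization used in the paper may differ (e.g., distances versus squared distances).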

Additional Information

© 2019 Neural Information Processing Systems Foundation, Inc. Nikhil Ghosh was supported in part by a Caltech Summer Undergraduate Research Fellowship. Yuxin Chen was supported in part by a Swiss NSF Early Mobility Postdoctoral Fellowship. This work was also supported in part by gifts from PIMCO and Bloomberg.

Attached Files

Published - 9326-landmark-ordinal-embedding.pdf

Submitted - 1910.12379.pdf

Supplemental Material - 9326-landmark-ordinal-embedding-supplemental.zip

Files (7.3 MB total)

md5:3ccaabf8d6f38088e40224a336b3f9c7 - 3.4 MB
md5:34e630fa0d0dd08c65aaf754f44318cf - 773.6 kB
md5:e48aa64278494c99507345b60d623cf9 - 3.1 MB

Additional details

Created:
August 19, 2023
Modified:
October 18, 2023