Published March 30, 2016 | Accepted Version
Report Open

Random Projection Estimation of Discrete-Choice Models With Large Choice Sets

Abstract

We introduce sparse random projection, an important dimension-reduction tool from machine learning, for the estimation of discrete-choice models with high-dimensional choice sets. First, the high-dimensional data are compressed into a lower-dimensional Euclidean space using random projections. Second, estimation proceeds using cyclic monotonicity moment inequalities implied by the multinomial choice model; the estimation procedure is semi-parametric and does not require explicit distributional assumptions on the random utility errors. The random projection procedure is justified via the Johnson-Lindenstrauss Lemma: pairwise distances between data points are approximately preserved under data compression, which we exploit to show convergence of our estimator. The estimator performs well in a computational simulation and in an application to a supermarket scanner dataset.
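The compression step described in the abstract can be illustrated with a short sketch. The Python snippet below is a minimal, hypothetical example of a sparse random projection with entries in {-sqrt(s), 0, +sqrt(s)} (in the style of Achlioptas and Li et al.); the function name, the sparsity parameter s = 3, and the target dimension k = 50 are illustrative assumptions, not the paper's exact construction or scaling.

import numpy as np

def sparse_random_projection(X, k, s=3, seed=0):
    # Project the rows of X from d to k dimensions with a sparse random
    # matrix whose entries equal +sqrt(s), 0, or -sqrt(s) with
    # probabilities 1/(2s), 1 - 1/s, and 1/(2s).  By the
    # Johnson-Lindenstrauss Lemma, pairwise distances between rows are
    # approximately preserved after compression.
    d = X.shape[1]
    rng = np.random.default_rng(seed)
    R = rng.choice([np.sqrt(s), 0.0, -np.sqrt(s)],
                   size=(d, k),
                   p=[1 / (2 * s), 1 - 1 / s, 1 / (2 * s)])
    return X @ R / np.sqrt(k)

# Example: 100 observations of a 10,000-dimensional covariate vector
# (e.g., indicators over a large choice set), compressed to 50 dimensions
# before estimation.
X = np.random.default_rng(1).normal(size=(100, 10_000))
X_low = sparse_random_projection(X, k=50)
print(X_low.shape)  # (100, 50)

Because roughly a 1 - 1/s fraction of the entries of the projection matrix are zero, a sparse projection is cheaper to store and faster to apply than a dense Gaussian one, which is its appeal for very high-dimensional choice sets.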

Additional Information

First draft: February 29, 2016. This draft: March 2016. We thank Hiroaki Kaido, Michael Leung, Sergio Montero, and participants at the DATALEAD conference (Paris, November 2015) for helpful comments.

Attached Files

Accepted Version - SSWP_1416.pdf

Files

SSWP_1416.pdf (702.0 kB)
md5:ef801dab3ceafb14e1a60f57fd1051e0

Additional details

Created: August 20, 2023
Modified: January 13, 2024