Published December 2017
Book Section - Chapter (Open Access)

A Universal Analysis of Large-Scale Regularized Least Squares Solutions

Abstract

A problem that has been of recent interest in statistical inference, machine learning and signal processing is that of understanding the asymptotic behavior of regularized least squares solutions under random measurement matrices (or dictionaries). The Least Absolute Shrinkage and Selection Operator (LASSO or least-squares with ℓ_1 regularization) is perhaps one of the most interesting examples. Precise expressions for the asymptotic performance of LASSO have been obtained for a number of different cases, in particular when the elements of the dictionary matrix are sampled independently from a Gaussian distribution. It has also been empirically observed that the resulting expressions remain valid when the entries of the dictionary matrix are independently sampled from certain non-Gaussian distributions. In this paper, we confirm these observations theoretically when the distribution is sub-Gaussian. We further generalize the previous expressions for a broader family of regularization functions and under milder conditions on the underlying random, possibly non-Gaussian, dictionary matrix. In particular, we establish the universality of the asymptotic statistics (e.g., the average quadratic risk) of LASSO with non-Gaussian dictionaries.
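The universality phenomenon described above can be probed empirically with a small numerical sketch (this is an illustration, not the paper's analysis): solve the LASSO for a Gaussian dictionary and for a Rademacher (±1 entries, hence sub-Gaussian) dictionary of the same shape, and compare the resulting average quadratic risks. The solver below is plain ISTA (proximal gradient), and all dimensions, the sparsity level, the noise scale, and the regularization weight are arbitrary choices for demonstration.

```python
import numpy as np


def lasso_ista(A, y, lam, n_iter=500):
    """Solve min_x 0.5*||y - A x||^2 + lam*||x||_1 via ISTA (proximal gradient)."""
    p = A.shape[1]
    x = np.zeros(p)
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1 / Lipschitz constant of the smooth part
    for _ in range(n_iter):
        z = x - step * (A.T @ (A @ x - y))          # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft-threshold
    return x


def avg_quadratic_risk(dist, n=200, p=400, k=20, sigma=0.1, lam=0.1, seed=0):
    """Average quadratic risk (1/p)*||x_hat - x_0||^2 for one random instance."""
    rng = np.random.default_rng(seed)
    x0 = np.zeros(p)
    x0[:k] = 1.0  # k-sparse ground truth (illustrative)
    if dist == "gaussian":
        A = rng.standard_normal((n, p)) / np.sqrt(n)
    else:  # Rademacher entries: a simple sub-Gaussian, non-Gaussian example
        A = rng.choice([-1.0, 1.0], size=(n, p)) / np.sqrt(n)
    y = A @ x0 + sigma * rng.standard_normal(n)
    x_hat = lasso_ista(A, y, lam)
    return np.mean((x_hat - x0) ** 2)


r_gauss = avg_quadratic_risk("gaussian")
r_rad = avg_quadratic_risk("rademacher")
print(r_gauss, r_rad)
```

In line with the universality result, the two risks should be of comparable magnitude at moderate problem sizes; the paper makes this precise in the large-system limit for a broad family of regularizers.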

Additional Information

© 2017 Neural Information Processing Systems Foundation, Inc.

Attached Files

Published - 6930-a-universal-analysis-of-large-scale-regularized-least-squares-solutions.pdf

Additional details

Created: August 19, 2023
Modified: March 5, 2024