Journal Article (Open Access)
Published August 2020 | Published + Submitted versions available

Conditional Linear Regression

Abstract

Work in machine learning and statistics commonly focuses on building models that capture the vast majority of data, possibly ignoring a segment of the population as outliers. However, there may not exist a good, simple model for the distribution, so we seek to find a small subset where there exists such a model. We give a computationally efficient algorithm with theoretical analysis for the conditional linear regression task, which is the joint task of identifying a significant portion of the data distribution, described by a k-DNF, along with a linear predictor on that portion with a small loss. In contrast to work in robust statistics on small subsets, our loss bounds do not feature a dependence on the density of the portion we fit, and compared to previous work on conditional linear regression, our algorithm's running time scales polynomially with the sparsity of the linear predictor. We also demonstrate empirically that our algorithm can leverage this advantage to obtain a k-DNF with a better linear predictor in practice.
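The task the abstract describes can be made concrete with a toy sketch. The following is not the paper's algorithm (which achieves polynomial running time in the predictor's sparsity); it is a naive brute-force illustration of the problem itself: search over small conjunctive conditions (single terms of a k-DNF) on boolean attributes, fit ordinary least squares on the covered subset, and keep the condition/predictor pair with the smallest loss among conditions covering a minimum fraction of the data. All names and parameters (`B`, `X`, `min_coverage`) are illustrative assumptions, not from the paper.

```python
import itertools
import numpy as np

def conditional_least_squares(B, X, y, k=2, min_coverage=0.3):
    """Toy brute-force search for the conditional linear regression task.

    B : (n, d) boolean condition attributes
    X : (n, p) real-valued regression features
    y : (n,) targets

    Enumerates conjunctions of up to k literals over the columns of B,
    fits least squares on the covered subset, and returns the
    (mse, term, weights) triple with the smallest mean squared error
    among conditions covering at least `min_coverage` of the data.
    """
    n, d = B.shape
    best = None
    # A literal is (column index, required truth value).
    literals = [(j, v) for j in range(d) for v in (True, False)]
    for r in range(1, k + 1):
        for term in itertools.combinations(literals, r):
            mask = np.ones(n, dtype=bool)
            for j, v in term:
                mask &= (B[:, j] == v)
            if mask.mean() < min_coverage:
                continue  # condition covers too small a portion
            # Least squares with an intercept column on the covered subset.
            Xs = np.column_stack([X[mask], np.ones(mask.sum())])
            w, *_ = np.linalg.lstsq(Xs, y[mask], rcond=None)
            mse = np.mean((Xs @ w - y[mask]) ** 2)
            if best is None or mse < best[0]:
                best = (mse, term, w)
    return best
```

The exponential enumeration over terms is exactly what the paper avoids; the sketch only clarifies what "identifying a portion described by a k-DNF, along with a linear predictor on that portion" means as an optimization problem.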

Additional Information

© 2020 by the author(s). Brendan Juba was supported by an AFOSR Young Investigator Award and NSF award CCF-1718380; part of this work was performed while visiting the Simons Institute for the Theory of Computing. Part of this work was performed during an REU at Washington University in St. Louis, where Diego Calderon was supported by WUSEF and Lisa Ruan by the NSF Big Data Analytics REU Site, award IIS-1560191.

Attached Files

Published - calderon20a.pdf

Submitted - 1806.02326.pdf

Files (5.1 MB)

calderon20a.pdf — 422.6 kB (md5:df92e58fd57c591e41bc84bdcbb01c5b)
1806.02326.pdf — 4.7 MB (md5:14c4956642c4d1ca7b717fafcb2fe987)

Additional details

Created: August 19, 2023
Modified: October 20, 2023