Published April 2021 | Supplemental Material + Published + Submitted
Journal Article | Open Access

Active Learning under Label Shift

Abstract

We address the problem of active learning under label shift: when the class proportions of the source and target domains differ. We introduce a "medial distribution" to trade off between importance weighting and class-balanced sampling, and propose their combined use in active learning. Our method, Mediated Active Learning under Label Shift (MALLS), balances the bias from class-balanced sampling against the variance from importance weighting. We prove sample complexity and generalization guarantees for MALLS which show that active learning reduces asymptotic sample complexity even under arbitrary label shift. We empirically demonstrate that MALLS scales to high-dimensional datasets and can reduce the sample complexity of active learning by 60% in deep active learning tasks.
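To make the bias-variance tradeoff in the abstract concrete, here is a minimal sketch of one plausible reading: a medial label distribution formed by interpolating between the source label distribution (no reweighting, biased but low variance) and the target label distribution (full importance weighting, unbiased but high variance). This is not the paper's exact construction; the function names, the interpolation parameter alpha, and the example distributions are all hypothetical.

```python
import numpy as np

def medial_distribution(p_source, p_target, alpha=0.5):
    """Hypothetical illustration: interpolate between the source and
    target label distributions. alpha=0 recovers the source (no
    reweighting, high bias); alpha=1 recovers the target (pure
    importance weighting, high variance)."""
    p_med = alpha * np.asarray(p_target) + (1 - alpha) * np.asarray(p_source)
    return p_med / p_med.sum()  # renormalize to a valid distribution

def importance_weights(p_med, p_source):
    """Per-class weights w(y) = p_med(y) / p_source(y); weighting
    source examples by w moves the effective label distribution
    toward the medial one."""
    return np.asarray(p_med) / np.asarray(p_source)

# Example: a 3-class problem where class 2 is rare in the source
# domain but common in the target domain (label shift).
p_source = np.array([0.5, 0.4, 0.1])
p_target = np.array([0.2, 0.3, 0.5])

p_med = medial_distribution(p_source, p_target, alpha=0.5)
print("medial distribution:", p_med)
print("importance weights:", importance_weights(p_med, p_source))
```

Note how the interpolated weights for the rare class are smaller than the full target/source ratio (5.0 here), capping the variance that extreme importance weights would otherwise introduce, at the cost of some residual bias.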

Additional Information

© 2021 by the author(s). Anqi Liu is supported by the PIMCO Postdoctoral Fellowship. Prof. Anandkumar is supported by the Bren endowed Chair, faculty awards from Microsoft, Google, and Adobe, Beyond Limits, and LwLL grants. This work is also supported by funding from Raytheon and NASA TRISH.

Attached Files

Published - zhao21b.pdf

Submitted - 2007.08479.pdf

Supplemental Material - zhao21b-supp.pdf

Files (8.1 MB total)

md5:2787d4d05b3f412089abf52bef93ae91 (3.0 MB)
md5:682acef099f854adba1531a3438f0816 (4.3 MB)
md5:171114673d524b299f84233b01975f77 (849.9 kB)

Additional details

Created: August 20, 2023
Modified: March 27, 2024