Published March 28, 2019 | Submitted
Conference Paper | Open Access

Regularized Learning for Domain Adaptation under Label Shifts

Abstract

We propose Regularized Learning under Label Shifts (RLLS), a principled and practical domain-adaptation algorithm to correct for shifts in the label distribution between a source and a target domain. We first estimate importance weights using labeled source data and unlabeled target data, and then train a classifier on the weighted source samples. We derive a generalization bound for the classifier on the target domain which is independent of the (ambient) data dimensions, and instead only depends on the complexity of the function class. To the best of our knowledge, this is the first generalization bound for the label-shift problem where the labels in the target domain are not available. Based on this bound, we propose a regularized estimator for the small-sample regime which accounts for the uncertainty in the estimated weights. Experiments on the CIFAR-10 and MNIST datasets show that RLLS improves classification accuracy, especially in the low-sample and large-shift regimes, compared to previous methods.
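The record itself contains no code, but the two-step recipe in the abstract (estimate per-class importance weights from labeled source and unlabeled target data, then train on reweighted source samples, shrinking the weight estimate toward uniform in the small-sample regime) can be sketched as follows. This is a minimal illustration using a confusion-matrix (black-box shift) estimator with a simple shrinkage step; the function name, the `lam` shrinkage parameter, and the use of scikit-learn's LogisticRegression are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of label-shift importance-weight estimation in the
# spirit of RLLS. Assumes class labels are integers 0..num_classes-1.
import numpy as np
from sklearn.linear_model import LogisticRegression

def estimate_label_shift_weights(X_src, y_src, X_tgt, num_classes, lam=0.5):
    """Estimate per-class weights w[y] ~ p_target(y) / p_source(y) from
    labeled source data and unlabeled target data, then shrink the
    estimate toward uniform weights to account for finite-sample noise."""
    # 1. Fit a black-box classifier on the labeled source data.
    clf = LogisticRegression(max_iter=1000).fit(X_src, y_src)

    # 2. Confusion matrix C[i, j] = P_source(prediction = i, true label = j).
    preds_src = clf.predict(X_src)
    C = np.zeros((num_classes, num_classes))
    for p, y in zip(preds_src, y_src):
        C[p, y] += 1.0
    C /= len(y_src)

    # 3. Distribution of the classifier's predictions on unlabeled target data.
    preds_tgt = clf.predict(X_tgt)
    mu_tgt = np.bincount(preds_tgt, minlength=num_classes) / len(preds_tgt)

    # 4. Solve C w ~= mu_tgt for the unregularized weight estimate.
    w_hat, *_ = np.linalg.lstsq(C, mu_tgt, rcond=None)
    w_hat = np.clip(w_hat, 0.0, None)

    # 5. Shrink toward the all-ones (no-shift) weight vector; lam controls
    #    how much of the estimated shift is trusted in the low-sample regime.
    return (1.0 - lam) * np.ones(num_classes) + lam * w_hat

# Usage sketch: reweight source samples when training the final classifier.
# w = estimate_label_shift_weights(X_src, y_src, X_tgt, num_classes=10)
# final_clf = LogisticRegression(max_iter=1000).fit(
#     X_src, y_src, sample_weight=w[y_src])
```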

Additional Information

K. Azizzadenesheli is supported in part by NSF Career Award CCF-1254106 and Air Force Grant FA9550-15-1-0221. This research was conducted while the first author was a visiting researcher at Caltech. Anqi Liu is supported in part by a DOLCIT Postdoctoral Fellowship at Caltech and Caltech's Center for Autonomous Systems and Technologies. Fan Yang is supported by the Institute for Theoretical Studies ETH Zurich, Dr. Max Rössler, and the Walter Haefner Foundation. A. Anandkumar is supported in part by a Microsoft Faculty Fellowship, a Google faculty award, an Adobe grant, NSF Career Award CCF-1254106, and AFOSR YIP FA9550-15-1-0221.

Attached Files

Submitted - 1903.09734.pdf (1.6 MB, md5:407f3bb3d1169aec5f57d3bbf8fe7d34)

Additional details

Created: August 19, 2023
Modified: October 20, 2023