A Theoretical Analysis on Independence-driven Importance Weighting for Covariate-shift Generalization. (arXiv:2111.02355v2 [cs.LG] UPDATED)
Web: http://arxiv.org/abs/2111.02355
June 20, 2022, 1:12 a.m. | Renzhe Xu, Xingxuan Zhang, Zheyan Shen, Tong Zhang, Peng Cui
stat.ML updates on arXiv.org
Covariate-shift generalization, a typical case of out-of-distribution (OOD)
generalization, requires good performance on an unknown test distribution that
differs from the accessible training distribution in the form of covariate
shift. Recently, independence-driven importance weighting algorithms from the
stable learning literature have shown empirical effectiveness in dealing with
covariate-shift generalization across several learning models, including
regression algorithms and deep neural networks, but theoretical analyses of
these algorithms have been missing. In this paper, we theoretically prove the
effectiveness of such algorithms by explaining them …
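For readers unfamiliar with the setting, the following is a minimal sketch of classical importance weighting under covariate shift, not the paper's independence-driven variant: training samples are reweighted by the density ratio w(x) = p_test(x) / p_train(x), which is assumed known analytically here for illustration. Under a misspecified linear model, the weighted fit recovers the test-optimal slope while the unweighted fit does not.

```python
import numpy as np

rng = np.random.default_rng(0)

# Covariate shift: train covariates ~ N(0,1), test covariates ~ N(1,1).
# The true relation y = x^2 + noise is the same under both distributions,
# but the linear model we fit is misspecified, so weighting matters.
n = 5000
x_tr = rng.normal(0.0, 1.0, n)
y_tr = x_tr ** 2 + rng.normal(0.0, 0.1, n)

def gaussian_density(x, mu):
    # Standard-deviation-1 Gaussian density with mean mu.
    return np.exp(-0.5 * (x - mu) ** 2) / np.sqrt(2.0 * np.pi)

# Importance weights w(x) = p_test(x) / p_train(x), known in this toy setup.
w = gaussian_density(x_tr, 1.0) / gaussian_density(x_tr, 0.0)

# Least-squares slope through the origin, weighted vs. unweighted.
slope_weighted = np.sum(w * x_tr * y_tr) / np.sum(w * x_tr ** 2)
slope_plain = np.sum(x_tr * y_tr) / np.sum(x_tr ** 2)

# Population values: weighted slope -> E_test[x^3]/E_test[x^2] = 4/2 = 2,
# unweighted slope -> E_train[x^3]/E_train[x^2] = 0/1 = 0.
print(f"weighted slope ~ {slope_weighted:.2f}, plain slope ~ {slope_plain:.2f}")
```

The weighted estimate lands near 2 (the slope optimal under the shifted test distribution), while the unweighted estimate stays near 0; the stable-learning algorithms analyzed in the paper instead learn weights that decorrelate the covariates rather than matching a known test density.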
Latest AI/ML/Big Data Jobs
Machine Learning Researcher - Saalfeld Lab
@ Howard Hughes Medical Institute - Chevy Chase, MD | Ashburn, Virginia
Project Director, Machine Learning in US Health
@ ideas42.org | Remote, US
Data Science Intern
@ NannyML | Remote
Machine Learning Engineer NLP/Speech
@ Play.ht | Remote
Research Scientist, 3D Reconstruction
@ Yembo | Remote, US
Clinical Assistant or Associate Professor of Management Science and Systems
@ University at Buffalo | Buffalo, NY