Web: http://arxiv.org/abs/2112.09796

May 12, 2022, 1:10 a.m. | Niklas Smedemark-Margulies, Ye Wang, Toshiaki Koike-Akino, Deniz Erdogmus

stat.ML updates on arXiv.org

We provide a regularization framework for subject transfer learning in which
we seek to train an encoder and classifier to minimize classification loss,
subject to a penalty measuring independence between the latent representation
and the subject label. We introduce three notions of independence and
corresponding penalty terms using mutual information or divergence as a proxy
for independence. For each penalty term, we provide several concrete estimation
algorithms, using analytic methods as well as neural critic functions. We
provide a hands-off …
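The abstract's core idea (minimize classification loss subject to a penalty measuring dependence between the latent representation and the subject label) can be sketched numerically. The sketch below is an assumption-laden illustration, not the paper's method: it uses a crude divergence proxy (squared distance between subject-conditional latent means and the global mean) in place of the paper's mutual-information and neural-critic estimators, and all function names are invented for illustration.

```python
import numpy as np

def cross_entropy(probs, labels):
    # Mean negative log-likelihood of the true class.
    return -np.mean(np.log(probs[np.arange(len(labels)), labels] + 1e-12))

def subject_dependence_penalty(z, subjects):
    # Crude divergence proxy for independence: squared distance between
    # each subject's mean latent vector and the global mean. It is zero
    # when all subject-conditional means coincide -- a weak necessary
    # condition for independence of the latent code z and the subject
    # label. (The paper uses stronger estimators, e.g. neural critics.)
    global_mean = z.mean(axis=0)
    uniq = np.unique(subjects)
    penalty = 0.0
    for s in uniq:
        penalty += np.sum((z[subjects == s].mean(axis=0) - global_mean) ** 2)
    return penalty / len(uniq)

def regularized_loss(probs, labels, z, subjects, lam=1.0):
    # Classification loss plus lambda times the independence penalty,
    # matching the "loss subject to a penalty" structure in the abstract.
    return cross_entropy(probs, labels) + lam * subject_dependence_penalty(z, subjects)
```

For example, latents whose per-subject means coincide incur zero penalty, while latents clustered by subject incur a positive penalty that grows with the separation, so minimizing `regularized_loss` pushes the encoder toward subject-invariant representations.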

