Conditional entropy minimization principle for learning domain invariant representation features. (arXiv:2201.10460v1 [cs.LG])
Web: http://arxiv.org/abs/2201.10460
Jan. 26, 2022, 2:11 a.m. | Thuan Nguyen, Boyang Lyu, Prakash Ishwar, Matthias Scheutz, Shuchin Aeron
cs.LG updates on arXiv.org
Invariance principle-based methods, for example Invariant Risk Minimization (IRM), have recently emerged as promising approaches for Domain Generalization (DG). Despite the promising theory, invariance principle-based approaches fail in common classification tasks because the true invariant features become mixed with spurious invariant features. In this paper, we propose a framework based on the conditional entropy minimization principle to filter out the spurious invariant features, leading to a new algorithm with better generalization capability. We theoretically prove that …
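The excerpt does not include the authors' algorithm, so the following is only a minimal sketch of how a conditional-entropy penalty on learned representation features could plug into ordinary supervised training. The Gaussian approximation of H(Z | Y), the network sizes, the weight lambda_ce, and all names (FeatureExtractor, cond_entropy_gaussian) are illustrative assumptions, not the paper's method.

```python
# Sketch (assumed, not the paper's implementation): train a feature extractor with
# loss = cross-entropy + lambda_ce * H(Z | Y), where H(Z | Y) is approximated by the
# average per-class Gaussian entropy (log-det of the class-conditional covariance).
import torch
import torch.nn as nn
import torch.nn.functional as F

class FeatureExtractor(nn.Module):
    """Small MLP mapping inputs to a low-dimensional representation Z."""
    def __init__(self, in_dim=32, feat_dim=16):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(), nn.Linear(64, feat_dim))

    def forward(self, x):
        return self.net(x)

def cond_entropy_gaussian(z, y, num_classes, eps=1e-4):
    """Approximate H(Z | Y): average over classes of 0.5 * logdet(Cov(Z | Y=c)),
    up to additive constants. Assumes class-conditional Gaussian features."""
    ent = z.new_zeros(())
    count = 0
    for c in range(num_classes):
        zc = z[y == c]
        if zc.shape[0] < 2:
            continue  # need at least two samples to estimate a covariance
        cov = torch.cov(zc.T) + eps * torch.eye(z.shape[1], device=z.device)
        ent = ent + 0.5 * torch.logdet(cov)
        count += 1
    return ent / max(count, 1)

# Toy training step on random data, just to show where the penalty enters.
torch.manual_seed(0)
num_classes, lambda_ce = 3, 0.1
feat = FeatureExtractor()
clf = nn.Linear(16, num_classes)
opt = torch.optim.Adam(list(feat.parameters()) + list(clf.parameters()), lr=1e-3)

x = torch.randn(128, 32)
y = torch.randint(0, num_classes, (128,))

z = feat(x)
loss = F.cross_entropy(clf(z), y) + lambda_ce * cond_entropy_gaussian(z, y, num_classes)
opt.zero_grad()
loss.backward()
opt.step()
print(f"loss = {loss.item():.4f}")
```

The penalty shrinks the within-class spread of the representation; how the paper uses conditional entropy to separate true from spurious invariant features across domains is not specified in the excerpt above.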