Cost-effective Framework for Gradual Domain Adaptation with Multifidelity. (arXiv:2202.04359v3 [stat.ML] UPDATED)
Nov. 11, 2022, 2:13 a.m. | Shogo Sagawa, Hideitsu Hino
stat.ML updates on arXiv.org arxiv.org
In domain adaptation, prediction performance degrades when there is a large
distance between the source and target domains. Gradual domain adaptation is
one solution to this issue: it assumes access to intermediate domains that
shift gradually from the source to the target domain. Previous works assumed
that the number of samples in the intermediate domains is sufficiently large,
so that self-training is possible without labeled data. If the …
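The gradual self-training idea the abstract describes can be illustrated with a minimal sketch (not the paper's method or code): train a classifier on the labeled source domain, then walk through the intermediate domains in order, pseudo-labeling each one with the current model and retraining on those pseudo-labels. The rotating-blobs data, the plain logistic-regression fit, and all function names below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_domain(angle, n=200):
    # Two Gaussian blobs whose mean direction rotates with `angle`,
    # simulating a domain that shifts gradually (illustrative data).
    d = np.array([np.cos(angle), np.sin(angle)])
    X = np.vstack([rng.normal(0, 0.3, (n, 2)) - d,
                   rng.normal(0, 0.3, (n, 2)) + d])
    y = np.concatenate([np.zeros(n), np.ones(n)])
    return X, y

def fit_logreg(X, y, lr=0.5, steps=500):
    # Plain gradient-descent logistic regression with a bias column.
    Xb = np.hstack([X, np.ones((len(X), 1))])
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1 / (1 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - y) / len(y)
    return w

def predict(w, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return (Xb @ w > 0).astype(float)

# Source at 0 degrees, target at 90 degrees, intermediates in between.
angles = np.linspace(0, np.pi / 2, 6)
Xs, ys = make_domain(angles[0])
w = fit_logreg(Xs, ys)              # supervised training on labeled source

for a in angles[1:-1]:              # gradual self-training, no labels used
    Xi, _ = make_domain(a)          # intermediate-domain labels unavailable
    pseudo = predict(w, Xi)         # pseudo-label with the current model
    w = fit_logreg(Xi, pseudo)      # retrain on the pseudo-labels

Xt, yt = make_domain(angles[-1])
acc = (predict(w, Xt) == yt).mean()
print(f"target accuracy after gradual self-training: {acc:.2f}")
```

Because each intermediate domain shifts only slightly, the current model's pseudo-labels stay mostly correct at every step, which is exactly the large-sample assumption the abstract says earlier work relied on.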