Towards a theory of out-of-distribution learning. (arXiv:2109.14501v4 [stat.ML] UPDATED)
Jan. 7, 2022, 2:10 a.m. | Ali Geisa, Ronak Mehta, Hayden S. Helm, Jayanta Dey, Eric Eaton, Jeffery Dick, Carey E. Priebe, Joshua T. Vogelstein
cs.LG updates on arXiv.org arxiv.org
What is learning? 20$^{th}$ century formalizations of learning theory --
which precipitated revolutions in artificial intelligence -- focus primarily on
$\mathit{in-distribution}$ learning, that is, learning under the assumption
that the training data are sampled from the same distribution as the evaluation
distribution. This assumption renders these theories inadequate for
characterizing 21$^{st}$ century real world data problems, which are typically
characterized by evaluation distributions that differ from the training data
distributions (referred to as out-of-distribution learning). We therefore make
a small …
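The distinction the abstract draws can be illustrated with a minimal sketch (not from the paper; the distributions, threshold, and numbers below are illustrative assumptions): a classifier whose decision rule is tuned to the training distribution loses accuracy when the evaluation distribution shifts.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Training distribution (illustrative): class 0 ~ N(0,1), class 1 ~ N(2,1).
x_train = np.concatenate([rng.normal(0, 1, n), rng.normal(2, 1, n)])
y_train = np.concatenate([np.zeros(n), np.ones(n)])

# A simple threshold classifier; t = 1.0 is the optimal boundary
# midway between the two training class means.
threshold = 1.0

def accuracy(x, y, t):
    """Fraction of points correctly classified by the rule x > t."""
    return float(np.mean((x > t) == y))

acc_in = accuracy(x_train, y_train, threshold)

# Evaluation distribution shifted: class means move to -1 and 1, so the
# in-distribution threshold no longer sits between the class means.
x_eval = np.concatenate([rng.normal(-1, 1, n), rng.normal(1, 1, n)])
y_eval = np.concatenate([np.zeros(n), np.ones(n)])
acc_out = accuracy(x_eval, y_eval, threshold)

print(f"in-distribution accuracy:     {acc_in:.2f}")
print(f"out-of-distribution accuracy: {acc_out:.2f}")
```

Classical in-distribution theory analyzes only the first number; the paper's point is that real-world problems are characterized by the gap between the two.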