Higher degree sum-of-squares relaxations robust against oblivious outliers. (arXiv:2211.07327v1 [cs.LG])
Nov. 15, 2022, 2:13 a.m. | Tommaso d'Orsi, Rajai Nasser, Gleb Novikov, David Steurer
stat.ML updates on arXiv.org
We consider estimation models of the form $Y=X^*+N$, where $X^*$ is some
$m$-dimensional signal we wish to recover, and $N$ is symmetrically distributed
noise that may be unbounded in all but a small $\alpha$ fraction of the
entries. We introduce a family of algorithms that under mild assumptions
recover the signal $X^*$ in all estimation problems for which there exists a
sum-of-squares algorithm that succeeds in recovering the signal $X^*$ when the
noise $N$ is Gaussian. This essentially shows that …
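As a toy illustration of the noise model (not the paper's sum-of-squares algorithm), consider the simplest case where $X^*$ is a constant vector, i.e. scalar location estimation. Even when all but a small $\alpha$ fraction of the noise entries are unbounded, symmetry alone makes the coordinate-wise median a consistent estimator, while the empirical mean is destroyed by the heavy tails. The mixture weights and scales below are illustrative choices, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
m, alpha = 10_000, 0.05   # dimension and fraction of "inlier" entries
mu = 3.0                  # hypothetical signal: X* = mu * ones(m)

# Symmetric noise: an alpha fraction is small bounded Gaussian noise;
# the remaining entries are heavy-tailed Cauchy, scaled up so they are
# effectively unbounded -- but still symmetric about zero.
is_inlier = rng.random(m) < alpha
noise = np.where(
    is_inlier,
    rng.normal(0.0, 0.1, size=m),
    100.0 * rng.standard_cauchy(size=m),
)

y = mu + noise
print("mean:  ", np.mean(y))    # unreliable: dominated by Cauchy tails
print("median:", np.median(y))  # close to mu: each noise entry has median 0
```

Because each $N_i$ is symmetric about zero, every $Y_i$ exceeds $\mu$ with probability $1/2$, so the sample median concentrates around $\mu$ regardless of how heavy the tails are; this is the kind of structural fact the paper's relaxations exploit in far more general estimation problems.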