Web: http://arxiv.org/abs/2206.08220

June 17, 2022, 1:12 a.m. | Alex Lambert, Dimitri Bouche, Zoltan Szabo, Florence d'Alché-Buc

stat.ML updates on arXiv.org

The focus of the paper is functional output regression (FOR) with convoluted
losses. While most existing work considers the square loss setting, we leverage
extensions of the Huber and the $\epsilon$-insensitive loss (induced by infimal
convolution) and propose a flexible framework capable of handling various forms
of outliers and sparsity in the FOR family. We derive computationally tractable
algorithms relying on duality to tackle the resulting tasks in the context of
vector-valued reproducing kernel Hilbert spaces. The efficiency of the …
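The two losses the abstract names have simple scalar forms that convey the intuition: the Huber loss is the infimal convolution of the squared loss with a scaled absolute value (quadratic near zero, linear in the tails, hence robust to outliers), and the epsilon-insensitive loss is the infimal convolution of the absolute value with the indicator of a tube (exactly zero inside, hence sparse). A minimal NumPy sketch of these scalar versions (parameter names `kappa` and `eps` are ours, not from the paper):

```python
import numpy as np

def huber(r, kappa=1.0):
    # Huber loss: inf-convolution of 0.5*t^2 with kappa*|t|.
    # Quadratic for |r| <= kappa, linear beyond (robust to outliers).
    r = np.abs(r)
    return np.where(r <= kappa, 0.5 * r**2, kappa * r - 0.5 * kappa**2)

def eps_insensitive(r, eps=0.1):
    # Epsilon-insensitive loss: inf-convolution of |t| with the
    # indicator of [-eps, eps]. Zero inside the tube (induces sparsity).
    return np.maximum(np.abs(r) - eps, 0.0)
```

The paper extends these ideas to function-valued residuals in vector-valued RKHSs; the scalar sketch above is only meant to show why the Huber variant tempers outliers and the epsilon-insensitive variant yields sparsity.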

Tags: arxiv, convolution, losses, ml, regression
