Aug. 19, 2022, 1:11 a.m. | David Holzmüller, Viktor Zaverkin, Johannes Kästner, Ingo Steinwart

stat.ML updates on arXiv.org

The acquisition of labels for supervised learning can be expensive. To improve the
sample efficiency of neural network regression, we study active learning methods
that adaptively select batches of unlabeled data for labeling. We present a
framework for constructing such methods out of (network-dependent) base kernels,
kernel transformations, and selection methods. Our framework encompasses many
existing Bayesian methods based on Gaussian process approximations of neural
networks as well as non-Bayesian methods. Additionally, we propose to replace the
commonly used …
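
The abstract describes composing batch active-learning methods from three parts: a (network-dependent) base kernel, a kernel transformation, and a selection method. The sketch below illustrates that compositional idea only; it is a minimal, hypothetical NumPy example, not the authors' code or API. Here the base kernel is an inner product of stand-in "last-layer" features, the transformation is a Gaussian-process posterior covariance, and the selection method greedily picks the highest-variance pool points. All function names and the toy data are assumptions for illustration.

import numpy as np

def last_layer_kernel(features_a, features_b):
    # Base kernel: inner product of (network-dependent) last-layer features.
    return features_a @ features_b.T

def posterior_transform(k_pool_pool, k_pool_train, k_train_train, noise=1e-2):
    # Kernel transformation: GP posterior covariance of the pool given the labeled set.
    reg = k_train_train + noise * np.eye(k_train_train.shape[0])
    solve = np.linalg.solve(reg, k_pool_train.T)
    return k_pool_pool - k_pool_train @ solve

def greedy_max_variance(post_cov, batch_size, jitter=1e-8):
    # Selection method: greedily pick the point with the largest conditional
    # variance, then condition the remaining pool on it (max-det-style greedy).
    cov = post_cov.copy()
    selected = []
    for _ in range(batch_size):
        variances = np.diag(cov).copy()
        variances[selected] = -np.inf          # never pick the same point twice
        i = int(np.argmax(variances))
        selected.append(i)
        col = cov[:, i:i + 1]
        cov = cov - (col @ col.T) / (cov[i, i] + jitter)
    return selected

# Toy usage with random features standing in for a trained network's representations.
rng = np.random.default_rng(0)
train_feat, pool_feat = rng.normal(size=(20, 8)), rng.normal(size=(200, 8))
k_tt = last_layer_kernel(train_feat, train_feat)
k_pt = last_layer_kernel(pool_feat, train_feat)
k_pp = last_layer_kernel(pool_feat, pool_feat)
batch = greedy_max_variance(posterior_transform(k_pp, k_pt, k_tt), batch_size=5)
print("pool indices selected for labeling:", batch)

Swapping in a different base kernel, transformation, or selection rule changes only one of the three functions, which is the modularity the framework in the abstract aims for.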

Tags: active learning, arxiv, benchmark, framework, learning, ml, regression
