May 27, 2022, 1:11 a.m. | Vivien Cabannes, Francis Bach, Vianney Perchet, Alessandro Rudi

stat.ML updates on arXiv.org

The workhorse of machine learning is stochastic gradient descent. To access
stochastic gradients, it is common to iterate over the input/output pairs of a
training dataset. Interestingly, full supervision is not needed to access
stochastic gradients, which is the main motivation of this paper. After
formalizing the "active labeling" problem, which generalizes active learning
based on partial supervision, we provide a streaming technique that provably
minimizes the ratio of generalization error over the number of samples. We
illustrate …
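The idea that partial supervision can still yield valid stochastic gradients is easiest to see on a toy case. The sketch below is not the paper's algorithm; the single-coordinate query model, the dimensions, and the step size are illustrative assumptions. It runs streaming SGD on multi-output least-squares while reading only one coordinate of each label vector: reweighting the queried coordinate by m keeps the gradient estimate unbiased, so each partially labeled sample still provides an unbiased stochastic gradient.

import numpy as np

# Minimal sketch (assumed setup, not the paper's method): SGD on
# multi-output least-squares, querying ONE label coordinate per step.
# Unbiasedness: E_j[ m * (pred_j - y_j) * e_j ] = pred - y.

rng = np.random.default_rng(0)
d, m = 5, 3                            # toy input/output dimensions
theta_star = rng.normal(size=(d, m))   # synthetic ground-truth weights
theta = np.zeros((d, m))

for t in range(1, 5001):
    x = rng.normal(size=d)             # stream a fresh input
    y = theta_star.T @ x               # full label exists, but is not read
    j = rng.integers(m)                # active query: pick one coordinate
    y_j = y[j]                         # only this single value is queried
    pred = theta.T @ x                 # current predictions, shape (m,)
    grad = np.zeros((d, m))
    grad[:, j] = m * (pred[j] - y_j) * x   # unbiased partial-label gradient
    theta -= (0.05 / np.sqrt(t)) * grad    # Robbins-Monro step size

print("parameter error:", np.linalg.norm(theta - theta_star))

Each update touches a single column of theta, yet in expectation it matches the fully supervised per-sample gradient, which is the sense in which one stochastic gradient can be obtained without full supervision.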

arxiv labeling stochastic streaming
