Oct. 27, 2022, 1:13 a.m. | Eugenio Clerico, George Deligiannidis, Benjamin Guedj, Arnaud Doucet

stat.ML updates on arXiv.org

We establish a disintegrated PAC-Bayesian bound for classifiers that are
trained via continuous-time (non-stochastic) gradient descent. Contrary to
what is standard in the PAC-Bayesian setting, our result applies to a training
algorithm that is deterministic conditioned on a random initialisation,
without requiring any $\textit{de-randomisation}$ step. We provide a broad
discussion of the main features of the proposed bound, and we study its
behaviour, analytically and empirically, on linear models, finding promising
results.
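The training scheme the abstract refers to, continuous-time gradient descent (gradient flow), can be illustrated on a linear model. The sketch below is hypothetical and not taken from the paper: it approximates the flow $\mathrm{d}w/\mathrm{d}t = -\nabla L(w)$ by Euler integration on a toy least-squares problem, where the only randomness is the initialisation, after which training is fully deterministic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear regression data (illustrative setup, not from the paper)
X = rng.normal(size=(50, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=50)

def grad(w):
    # Gradient of the mean-squared-error loss L(w) = ||Xw - y||^2 / (2n)
    return X.T @ (X @ w - y) / len(y)

# Random initialisation: the sole source of randomness in the algorithm
w = rng.normal(size=3)

# Gradient flow dw/dt = -grad L(w), approximated by Euler steps
dt, steps = 0.01, 10_000
for _ in range(steps):
    w = w - dt * grad(w)

print(np.round(w, 2))  # close to the least-squares solution
```

With a small step size and long horizon, the iterate converges to the ordinary least-squares solution, so the trained classifier is a deterministic function of the random initial point, which is exactly the setting in which the paper's bound applies without a de-randomisation step.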

