June 10, 2024, 4:44 a.m. | Victor Letzelter (LTCI, S2A, IDS, IP Paris), David Perera (LTCI, S2A, IDS, IP Paris), Cédric Rommel (LTCI, S2A, IDS, IP Paris), Mathieu Fontaine (LT

stat.ML updates on arXiv.org

arXiv:2406.04706v1 Announce Type: cross
Abstract: Winner-takes-all training is a simple learning paradigm that handles ambiguous tasks by predicting a set of plausible hypotheses. Recently, a connection was established between Winner-takes-all training and centroidal Voronoi tessellations, showing that, once trained, the hypotheses should optimally quantize the shape of the conditional distribution to be predicted. However, the best use of these hypotheses for uncertainty quantification is still an open question. In this work, we show how to leverage the appealing geometric properties of the Winner-takes-all …
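As context for the paradigm described in the abstract, below is a minimal sketch of a Winner-takes-all training step, assuming a PyTorch regression setup; the network architecture, number of hypotheses, and toy data are illustrative assumptions, not taken from the paper.

```python
import torch
import torch.nn as nn

class MultiHypothesisNet(nn.Module):
    """Predicts K hypotheses per input for a 1-D regression target (illustrative)."""
    def __init__(self, in_dim: int = 8, n_hypotheses: int = 5):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 64), nn.ReLU(),
            nn.Linear(64, n_hypotheses),  # one scalar prediction per hypothesis
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)  # shape: (batch, K)

def wta_loss(hypotheses: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    """Winner-takes-all loss: only the best hypothesis per sample receives gradient."""
    # Squared error of every hypothesis against the target, shape (batch, K).
    errors = (hypotheses - target.unsqueeze(1)) ** 2
    # Keep the minimum error per sample (the "winner"), then average over the batch.
    return errors.min(dim=1).values.mean()

# Toy usage: an ambiguous target (randomly +1 or -1) that a single-point predictor
# would average to 0, while WTA hypotheses can spread out to cover both modes.
model = MultiHypothesisNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(256, 8)
y = torch.where(torch.rand(256) < 0.5, torch.tensor(1.0), torch.tensor(-1.0))

for _ in range(100):
    opt.zero_grad()
    loss = wta_loss(model(x), y)
    loss.backward()
    opt.step()
```

Because the gradient only flows through the winning hypothesis for each sample, the hypotheses specialize on different modes of the conditional distribution, which is the quantization behavior the abstract refers to.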

