Feb. 23, 2024, 5:42 a.m. | Francesco Malandrino, Giuseppe Di Giacomo, Marco Levorato, Carla Fabiana Chiasserini

cs.LG updates on arXiv.org

arXiv:2402.14346v1 Announce Type: new
Abstract: The existing work on the distributed training of machine learning (ML) models has consistently overlooked the distribution of the achieved learning quality, focusing instead on its average value. This leads to poor dependability of the resulting ML models, whose performance may be much worse than expected. We fill this gap by proposing DepL, a framework for dependable learning orchestration, able to make high-quality, efficient decisions on (i) the data to leverage for learning, (ii) the …

