Oct. 31, 2022, 1:11 a.m. | Chen Dun, Mirian Hipolito, Chris Jermaine, Dimitrios Dimitriadis, Anastasios Kyrillidis

cs.LG updates on arXiv.org

Asynchronous learning protocols have regained attention lately, especially in
the Federated Learning (FL) setting, where slower clients can severely impede
the learning process. Herein, we propose AsyncDrop, a novel asynchronous FL
framework that utilizes dropout regularization to handle device heterogeneity
in distributed settings. Overall, AsyncDrop achieves better performance than
state-of-the-art asynchronous methodologies, while incurring lower
communication and training-time overhead. The key idea revolves around
creating "submodels" out of the global model and distributing their …
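
To make the key idea concrete, below is a minimal sketch, not the authors' implementation, of how a dropout-style submodel could be carved out of a global model: each client gets a binary mask over hidden units and trains only the resulting sub-network. The helper names (make_submodel_mask, MaskedMLP), the PyTorch MLP, and the choice to shrink keep_prob for slower clients are all illustrative assumptions, not details from the paper.

```python
# Hypothetical sketch of dropout-based submodel creation for async FL.
# Not the AsyncDrop reference code; names and architecture are assumptions.
import torch
import torch.nn as nn

def make_submodel_mask(hidden_dim: int, keep_prob: float, seed: int) -> torch.Tensor:
    # Per-client binary mask over hidden units. A slower client could be
    # given a smaller keep_prob (an assumption about handling heterogeneity).
    g = torch.Generator().manual_seed(seed)
    return (torch.rand(hidden_dim, generator=g) < keep_prob).float()

class MaskedMLP(nn.Module):
    # Two-layer MLP whose hidden layer is masked, so the client effectively
    # trains only the sub-network selected by its mask.
    def __init__(self, in_dim: int, hidden_dim: int, out_dim: int, mask: torch.Tensor):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, hidden_dim)
        self.fc2 = nn.Linear(hidden_dim, out_dim)
        self.register_buffer("mask", mask)  # kept out of the optimizer's params

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = torch.relu(self.fc1(x)) * self.mask  # zero out the dropped units
        return self.fc2(h)

# Example: client 3 trains a submodel that keeps roughly half the hidden units.
mask = make_submodel_mask(hidden_dim=128, keep_prob=0.5, seed=3)
model = MaskedMLP(in_dim=32, hidden_dim=128, out_dim=10, mask=mask)
out = model(torch.randn(4, 32))
print(out.shape)  # torch.Size([4, 10])
```

In an asynchronous round, the server would merge each client's update back into the global model only at the coordinates its mask kept, which is one plausible way the reduced communication footprint could arise.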

arxiv asynchronous distributed dropout federated learning light
