March 26, 2024, 4:41 a.m. | Arash Badie-Modiri, Chiara Boldrini, Lorenzo Valerio, János Kertész, Márton Karsai

cs.LG updates on arXiv.org

arXiv:2403.15855v1 Announce Type: new
Abstract: Fully decentralised federated learning enables collaborative training of individual machine learning models on distributed devices on a network while keeping the training data localised. This approach enhances data privacy and eliminates both the single point of failure and the necessity for central coordination. Our research highlights that the effectiveness of decentralised federated learning is significantly influenced by the network topology of connected devices. A simplified numerical model for studying the early behaviour of these systems …
