March 26, 2024, 4:41 a.m. | Arash Badie-Modiri, Chiara Boldrini, Lorenzo Valerio, János Kertész, Márton Karsai

cs.LG updates on arXiv.org

arXiv:2403.15855v1 Announce Type: new
Abstract: Fully decentralised federated learning enables collaborative training of individual machine learning models on distributed devices on a network while keeping the training data localised. This approach enhances data privacy and eliminates both the single point of failure and the necessity for central coordination. Our research highlights that the effectiveness of decentralised federated learning is significantly influenced by the network topology of connected devices. A simplified numerical model for studying the early behaviour of these systems …
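
To make the setting concrete, below is a minimal sketch of decentralised federated averaging over a network topology. It is an illustrative gossip-style scheme and not the paper's numerical model; the ring topology, the quadratic local losses, and all parameter values are assumptions chosen for brevity.

    # Minimal sketch of decentralised federated averaging over a network topology.
    # Illustrative gossip-style scheme; the ring topology and quadratic local
    # losses are assumptions, not the paper's model.
    import numpy as np

    rng = np.random.default_rng(0)
    n_nodes, dim, rounds, lr = 10, 5, 50, 0.1

    # Each device holds its own private data: here, a local target vector that
    # never leaves the device.
    local_targets = rng.normal(size=(n_nodes, dim))

    # Ring topology: each device communicates only with its two neighbours.
    adjacency = np.zeros((n_nodes, n_nodes))
    for i in range(n_nodes):
        adjacency[i, (i - 1) % n_nodes] = 1
        adjacency[i, (i + 1) % n_nodes] = 1

    # Mixing matrix: uniform averaging over self plus neighbours (row-stochastic).
    degree = adjacency.sum(axis=1, keepdims=True)
    mixing = (adjacency + np.eye(n_nodes)) / (degree + 1)

    # Each node starts from its own randomly initialised model.
    models = rng.normal(size=(n_nodes, dim))

    for t in range(rounds):
        # 1) Local step: gradient of the quadratic loss ||w_i - target_i||^2 / 2.
        grads = models - local_targets
        models = models - lr * grads
        # 2) Communication step: average with neighbours according to the topology.
        models = mixing @ models

    # Consensus error indicates how far devices are from agreeing on one model;
    # its decay depends on the topology (the mixing matrix's spectral gap).
    consensus_error = np.linalg.norm(models - models.mean(axis=0), axis=1).mean()
    print(f"mean consensus error after {rounds} rounds: {consensus_error:.4f}")

Swapping the ring adjacency for a denser or more clustered graph changes how quickly the devices reach consensus, which is the kind of topology dependence the abstract refers to.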
