May 25, 2022, 1:10 a.m. | Hideya Ochiai, Yuwei Sun, Qingzhe Jin, Nattanon Wongwiwatchai, Hiroshi Esaki

cs.LG updates on arXiv.org

Federated learning has allowed training of a global model by aggregating local models trained on local nodes. However, it still relies on a client-server model, which could be further distributed, fully decentralized, partially connected, or even totally opportunistic. In this paper, we propose wireless ad hoc federated learning (WAFL), a fully distributed cooperative machine learning scheme organized by nodes that are physically nearby. Here, each node has a wireless interface and can communicate with nearby nodes when they are within the …
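The abstract only sketches the idea, so here is a minimal illustrative sketch of one neighbor-wise aggregation step for a WAFL-style node. It assumes aggregation amounts to blending a node's parameters with the average of whichever neighbors happen to be in radio range; the function name `neighbor_average`, the `mix_ratio` coefficient, and the PyTorch framing are assumptions for illustration, not the paper's exact update rule.

```python
import copy
from typing import Dict, List

import torch
import torch.nn as nn


def neighbor_average(local_model: nn.Module,
                     neighbor_states: List[Dict[str, torch.Tensor]],
                     mix_ratio: float = 0.5) -> nn.Module:
    """Blend a node's parameters with the mean of its current neighbors.

    Hypothetical update for illustration: the node keeps (1 - mix_ratio)
    of its own weights and moves mix_ratio of the way toward the average
    of the neighbor models it received over the ad hoc link. This is not
    the exact WAFL rule from the paper.
    """
    if not neighbor_states:
        return local_model  # nobody in radio range; keep training alone

    updated = copy.deepcopy(local_model)
    with torch.no_grad():
        for name, param in updated.named_parameters():
            neighbor_mean = torch.stack(
                [state[name] for state in neighbor_states]).mean(dim=0)
            param.mul_(1.0 - mix_ratio).add_(mix_ratio * neighbor_mean)
    return updated
```

In a deployment along these lines, each node would interleave such aggregation with local training steps, exchanging `state_dict()` snapshots opportunistically whenever a contact occurs.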

Tags: ad hoc, arxiv, distributed, federated learning, machine learning, wireless
