April 24, 2024, 4:42 a.m. | Bingnan Xiao, Jingjing Zhang, Wei Ni, Xin Wang


arXiv:2404.14811v1 Announce Type: cross
Abstract: Wireless federated learning (WFL) suffers from heterogeneity prevailing in the data distributions, computing powers, and channel conditions of participating devices. This paper presents a new Federated Learning with Adjusted leaRning ratE (FLARE) framework to mitigate the impact of this heterogeneity. The key idea is to allow the participating devices to adjust their individual learning rates and local training iterations, adapting to their instantaneous computing powers. The convergence upper bound of FLARE is established rigorously under …
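The key idea above — each device tuning its own learning rate and local iteration count to its instantaneous compute — can be sketched as follows. The scaling rule here is a hypothetical illustration (the paper's actual adjustment policy is not given in the truncated abstract); `adjust_schedule`, `base_lr`, and `reference_power` are names invented for this sketch.

```python
def adjust_schedule(base_lr, base_iters, compute_power, reference_power=1.0):
    """Return a (learning_rate, local_iterations) pair for one device.

    Illustrative rule only: a slower device (compute_power below
    reference_power) runs fewer local iterations, and compensates with a
    proportionally larger learning rate so its total local progress stays
    roughly comparable to that of a full-power device.
    """
    ratio = compute_power / reference_power
    local_iters = max(1, round(base_iters * ratio))
    lr = base_lr * base_iters / local_iters
    return lr, local_iters

# Three heterogeneous devices sharing the same nominal schedule:
for power in (0.25, 1.0, 2.0):
    lr, iters = adjust_schedule(base_lr=0.1, base_iters=8, compute_power=power)
    print(f"power={power}: lr={lr:.3f}, iters={iters}")
```

Under this toy rule a device with a quarter of the reference compute runs 2 local iterations at a 4x learning rate, while a device with double the compute runs 16 iterations at half the rate — the kind of per-device adaptation whose convergence FLARE analyzes.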

