FedCau: A Proactive Stop Policy for Communication and Computation Efficient Federated Learning
March 27, 2024, 4:43 a.m. | Afsaneh Mahmoudi, Hossein S. Ghadikolaei, José Mairton Barros Da Silva Júnior, Carlo Fischione
cs.LG updates on arXiv.org arxiv.org
Abstract: This paper investigates efficient distributed training of a Federated Learning (FL) model over a network of wireless devices. The communication iterations of the distributed training algorithm may be substantially degraded or even blocked by the effects of the devices' background traffic, packet losses, congestion, or latency. We abstract the communication-computation impacts as an 'iteration cost' and propose a cost-aware causal FL algorithm (FedCau) to tackle this problem. We propose an iteration-termination method that trades off the training …
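The abstract describes a termination policy that weighs training progress against a per-round 'iteration cost'. A minimal sketch of that idea, under assumptions of ours rather than the paper's exact FedCau criterion (the function name, the moving-average window, and the `cost_weight` exchange rate are all illustrative), is a causal rule that stops when the recent loss improvement per round no longer justifies the cost of one more round:

```python
# Hypothetical cost-aware causal stopping rule for FL training.
# This is an illustrative sketch, NOT the paper's exact FedCau policy:
# stop when the recent per-round drop in global loss falls below the
# (weighted) communication+computation cost of running one more round.

def should_stop(loss_history, iteration_cost, cost_weight=0.01, window=3):
    """Decide causally (using only past observations) whether to stop.

    loss_history:   global loss recorded after each completed round.
    iteration_cost: estimated cost of one additional round.
    cost_weight:    assumed exchange rate between loss improvement and cost.
    window:         number of recent rounds to average improvement over.
    """
    if len(loss_history) <= window:
        return False  # not enough history to estimate progress yet
    recent = loss_history[-(window + 1):]
    # average loss improvement per round over the last `window` rounds
    avg_improvement = (recent[0] - recent[-1]) / window
    # stop once the expected benefit of one more round is below its cost
    return avg_improvement < cost_weight * iteration_cost
```

With a loss that decays like 1/k, this rule keeps training early (large per-round gains) and terminates once improvements flatten, which is the qualitative trade-off the abstract describes.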
Jobs in AI, ML, Big Data
Data Architect @ University of Texas at Austin | Austin, TX
Data ETL Engineer @ University of Texas at Austin | Austin, TX
Lead GNSS Data Scientist @ Lurra Systems | Melbourne
Senior Machine Learning Engineer (MLOps) @ Promaton | Remote, Europe
Sr. VBI Developer II @ Atos | Texas, US, 75093
Wealth Management - Data Analytics Intern/Co-op Fall 2024 @ Scotiabank | Toronto, ON, CA