March 27, 2024, 4:43 a.m. | Afsaneh Mahmoudi, Hossein S. Ghadikolaei, José Mairton Barros Da Silva Júnior, Carlo Fischione

cs.LG updates on arXiv.org

arXiv:2204.07773v2 Announce Type: replace
Abstract: This paper investigates efficient distributed training of a Federated Learning (FL) model over a network of wireless devices. The communication iterations of the distributed training algorithm may be substantially slowed or even blocked by the devices' background traffic, packet losses, congestion, or latency. We abstract these communication-computation impacts as an "iteration cost" and propose a cost-aware causal FL algorithm (FedCau) to tackle this problem. We propose an iteration-termination method that trades off the training …
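The abstract describes terminating training when further iterations no longer pay for their cost. A minimal sketch of that idea (this is an illustration, not the paper's actual FedCau algorithm; the function name, loss model, and cost model here are hypothetical): stop as soon as the observed per-iteration loss improvement, divided by the cost just paid, falls below a threshold, using only past (causal) observations.

```python
# Illustrative cost-aware causal stopping rule. All names and the
# loss/cost dynamics below are hypothetical, not from the paper.

def cost_aware_stop(losses, costs, threshold=1e-3):
    """Return True if training should stop after the latest iteration.

    losses: global losses observed so far (past iterations only).
    costs:  communication + computation cost paid per iteration.
    """
    if len(losses) < 2:
        return False
    improvement = losses[-2] - losses[-1]       # progress from the last round
    return improvement / costs[-1] < threshold  # gain per unit cost too small

# Toy run: geometrically decaying loss, constant iteration cost.
losses, costs = [1.0], []
k = 0
while True:
    k += 1
    losses.append(losses[-1] * 0.7)  # hypothetical training progress
    costs.append(1.0)                # hypothetical per-iteration cost
    if cost_aware_stop(losses, costs, threshold=0.01):
        break
```

With these toy dynamics the rule halts once the marginal improvement drops below 1% of the iteration cost, rather than running to a fixed iteration budget.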

