Federated Learning Using Three-Operator ADMM
March 27, 2024, 4:43 a.m. | Shashi Kant, José Mairton B. da Silva Jr., Gabor Fodor, Bo Göransson, Mats Bengtsson, Carlo Fischione
cs.LG updates on arXiv.org arxiv.org
Abstract: Federated learning (FL) has emerged as an instance of the distributed machine learning paradigm that avoids transmitting the data generated on the users' side. Although the data are not transmitted, edge devices still have to cope with limited communication bandwidth, data heterogeneity, and straggler effects caused by the limited computational resources of users' devices. A prominent approach to overcoming such difficulties is FedADMM, which is based on the classical two-operator consensus alternating direction method of multipliers (ADMM). …
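For context on the baseline the paper builds on, here is a minimal sketch of classical two-operator consensus ADMM applied to a federated least-squares problem. This is an illustrative assumption, not the paper's algorithm: each client k holds local data (A_k, b_k), minimizes f_k(x) = 0.5·||A_k x − b_k||² subject to the consensus constraint x_k = z, and all names (rho, num_rounds, etc.) are hypothetical.

```python
import numpy as np

# Hedged sketch of two-operator consensus ADMM for federated least squares.
# Each client k solves its local x-update; the server averages to form z.
rng = np.random.default_rng(0)
d, num_clients, num_rounds, rho = 3, 4, 200, 1.0

# Synthetic local datasets (illustrative only).
A = [rng.standard_normal((10, d)) for _ in range(num_clients)]
b = [rng.standard_normal(10) for _ in range(num_clients)]

z = np.zeros(d)                                 # global (server) variable
x = [np.zeros(d) for _ in range(num_clients)]   # local primal variables
u = [np.zeros(d) for _ in range(num_clients)]   # scaled dual variables

for _ in range(num_rounds):
    # Local x-update on each device (closed form for quadratic f_k):
    # x_k = argmin f_k(x) + (rho/2)||x - z + u_k||^2
    for k in range(num_clients):
        x[k] = np.linalg.solve(A[k].T @ A[k] + rho * np.eye(d),
                               A[k].T @ b[k] + rho * (z - u[k]))
    # Server z-update: average of x_k + u_k (the consensus step).
    z = np.mean([x[k] + u[k] for k in range(num_clients)], axis=0)
    # Dual update on each device.
    for k in range(num_clients):
        u[k] += x[k] - z

# Compare against the centralized least-squares solution.
A_all, b_all = np.vstack(A), np.concatenate(b)
x_star, *_ = np.linalg.lstsq(A_all, b_all, rcond=None)
print("distance to centralized solution:", np.linalg.norm(z - x_star))
```

After enough rounds the consensus variable z converges to the centralized least-squares solution; FedADMM follows this two-operator template, and the paper's contribution is a three-operator extension of it.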