AdaptSFL: Adaptive Split Federated Learning in Resource-constrained Edge Networks
March 21, 2024, 4:41 a.m. | Zheng Lin, Guanqiao Qu, Wei Wei, Xianhao Chen, Kin K. Leung
cs.LG updates on arXiv.org arxiv.org
Abstract: The increasing complexity of deep neural networks poses significant barriers to democratizing them to resource-limited edge devices. To address this challenge, split federated learning (SFL) has emerged as a promising solution by offloading the primary training workload to a server via model partitioning while enabling parallel training among edge devices. However, although system optimization substantially influences the performance of SFL under resource-constrained systems, the problem remains largely uncharted. In this paper, we provide a …
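The core idea behind SFL that the abstract describes, partitioning a model at a cut layer so the edge device runs only the front layers and the server runs the rest, can be sketched as follows. This is a minimal illustration of generic split learning, not the paper's AdaptSFL algorithm; the layer shapes and names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-layer network, partitioned at the cut layer:
# the client (edge device) holds the front layer, the server holds the rest.
W_client = rng.standard_normal((4, 8))   # client-side weights (light workload)
W_server = rng.standard_normal((8, 2))   # server-side weights (heavy workload)

def client_forward(x):
    # The edge device computes only the front of the network ...
    return np.maximum(x @ W_client, 0.0)  # ReLU cut-layer activations

def server_forward(smashed):
    # ... and transmits the cut-layer activations ("smashed data") to the
    # server, which completes the forward pass.
    return smashed @ W_server

x = rng.standard_normal((3, 4))          # a mini-batch of 3 samples
smashed = client_forward(x)              # sent client -> server
out = server_forward(smashed)
print(out.shape)                         # (3, 2)
```

In full SFL, many clients run `client_forward` in parallel on their local data, the server backpropagates through its portion, and gradients at the cut layer are returned to each client; where to place the cut under resource constraints is the kind of system-optimization question the paper studies.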