April 4, 2024, 4:41 a.m. | Jaeyoung Song, Sang-Woon Jeon

cs.LG updates on arXiv.org

arXiv:2404.02395v1 Announce Type: new
Abstract: Federated learning aims to construct a global model that fits the dataset distributed across local devices without direct access to private data, leveraging communication between a server and the local devices. In the context of a practical communication scheme, we study the completion time required to achieve a target performance. Specifically, we analyze the number of iterations required for federated learning to reach a specific optimality gap from a minimum global loss. Subsequently, we characterize …
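The server/device communication pattern the abstract describes can be sketched as a single FedAvg-style training loop: each device fits a local model on its private data, and the server only ever sees (and averages) the model parameters, never the data itself. This is a minimal illustrative sketch, not the paper's scheme; the toy least-squares objective, the helper names, and all hyperparameters are assumptions.

```python
# Minimal FedAvg-style sketch (illustrative only, not the paper's scheme).
# Devices fit a 1-D least-squares model y = w*x on private data; the server
# aggregates parameters across communication rounds.

def local_update(w_global, data, lr=0.1, steps=5):
    """A device runs a few gradient steps on its private data."""
    w = w_global
    for _ in range(steps):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

def server_aggregate(local_weights):
    """The server averages locally trained models into a new global model."""
    return sum(local_weights) / len(local_weights)

# Two devices, each with private samples from y = 2x (never sent to the server).
device_data = [
    [(1.0, 2.0), (2.0, 4.0)],
    [(3.0, 6.0), (0.5, 1.0)],
]

w = 0.0  # initial global model
for _ in range(20):  # communication rounds until the optimality gap is small
    w = server_aggregate([local_update(w, d) for d in device_data])

print(round(w, 3))  # global model approaches the true slope 2.0
```

The number of communication rounds needed before `w` reaches a given gap from the optimum is exactly the kind of iteration count the abstract says the paper analyzes.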

