April 15, 2024, 4:43 a.m. | Shensheng Zheng, Wenhao Yuan, Xuehe Wang, Lingjie Duan

cs.LG updates on arXiv.org

arXiv:2303.14966v3 Announce Type: replace-cross
Abstract: Federated Learning (FL) has emerged as a prominent distributed machine learning framework that enables geographically dispersed clients to train a global model collaboratively while keeping their privacy-sensitive data local. However, due to the non-independent-and-identically-distributed (Non-IID) data generated by heterogeneous clients, the performance of conventional federated optimization schemes such as FedAvg and its variants deteriorates, calling for designs that adaptively adjust specific model parameters to alleviate the negative influence of heterogeneity. In this paper, by leveraging …
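For context on the baseline the abstract refers to, below is a minimal sketch of the FedAvg server-side aggregation step: client model parameters are averaged, weighted by local dataset size. The function and variable names are illustrative assumptions, not the authors' implementation; under Non-IID client data, this plain weighted averaging is precisely where performance tends to degrade.

```python
# Minimal FedAvg aggregation sketch (illustrative, not the paper's code).
from typing import Dict, List
import torch

def fedavg_aggregate(client_states: List[Dict[str, torch.Tensor]],
                     client_sizes: List[int]) -> Dict[str, torch.Tensor]:
    """Average client parameters, weighting each client by its local data size."""
    total = sum(client_sizes)
    global_state: Dict[str, torch.Tensor] = {}
    for key in client_states[0]:
        # Weighted sum of the same parameter tensor across all clients.
        global_state[key] = sum(
            state[key] * (n / total)
            for state, n in zip(client_states, client_sizes)
        )
    return global_state
```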
