LW-FedSSL: Resource-efficient Layer-wise Federated Self-supervised Learning
May 1, 2024, 4:43 a.m. | Ye Lin Tun, Chu Myaet Thwal, Le Quang Huy, Minh N. H. Nguyen, Choong Seon Hong
Source: cs.LG updates on arXiv.org
Abstract: Many studies integrate federated learning (FL) with self-supervised learning (SSL) to take advantage of raw training data distributed across edge devices. However, edge devices often struggle with high computation and communication costs imposed by SSL and FL algorithms. To tackle this hindrance, we propose LW-FedSSL, a layer-wise federated self-supervised learning approach that allows edge devices to incrementally train a single layer of the model at a time. Our LW-FedSSL comprises server-side calibration and representation alignment …
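The core idea in the abstract — clients train only one layer at a time while the rest of the model stays frozen, so only that layer's parameters need to be communicated and aggregated — can be illustrated with a toy sketch. This is a minimal illustration of layer-wise federated averaging in plain NumPy, not the paper's actual LW-FedSSL algorithm; the model, the "local update" stand-in, and all function names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy model: a stack of square weight matrices.
NUM_LAYERS, DIM, NUM_CLIENTS = 3, 4, 5

def local_update(weights, layer_idx, client_data):
    """Simulate one client's local step: only `layer_idx` is trainable,
    all other layers stay frozen (the layer-wise idea)."""
    updated = [w.copy() for w in weights]
    # Stand-in for a gradient step: nudge the active layer using the
    # client's data statistics (real training would minimize an SSL loss).
    grad = client_data.T @ client_data / len(client_data)
    updated[layer_idx] -= 0.01 * grad
    return updated

def fedavg(client_models, layer_idx):
    """Server aggregates only the layer currently being trained,
    so per-round communication is a single layer, not the full model."""
    return np.mean([m[layer_idx] for m in client_models], axis=0)

# Global model and per-client local data.
global_weights = [rng.standard_normal((DIM, DIM)) for _ in range(NUM_LAYERS)]
client_data = [rng.standard_normal((16, DIM)) for _ in range(NUM_CLIENTS)]

# Layer-wise federated training: one layer at a time, a few rounds each.
for layer_idx in range(NUM_LAYERS):
    for _ in range(2):
        client_models = [local_update(global_weights, layer_idx, d)
                         for d in client_data]
        global_weights[layer_idx] = fedavg(client_models, layer_idx)
```

The paper's server-side calibration and representation alignment steps are not modeled here; the sketch only shows why training and exchanging one layer at a time reduces per-round computation and communication on edge devices.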