March 5, 2024, 2:45 p.m. | Yuntao Wang, Zhou Su, Yanghe Pan, Tom H Luan, Ruidong Li, Shui Yu

cs.LG updates on arXiv.org arxiv.org

arXiv:2212.13992v2 Announce Type: replace-cross
Abstract: A key feature of federated learning (FL) is to preserve the data privacy of end users. However, potential privacy leakage still exists when exchanging gradients under FL. As a result, recent research often explores differential privacy (DP) approaches that add noise to the computed results to address privacy concerns with low overhead, which, however, degrades model performance. In this paper, we strike a balance between data privacy and efficiency by utilizing the …
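As context for the DP baseline the abstract refers to, below is a minimal sketch of the standard gradient-noising step (DP-SGD style) that a client might apply before uploading its update in FL. The function name and parameters (`dp_noisy_gradient`, `clip_norm`, `noise_multiplier`) are illustrative assumptions, not the paper's proposed method.

```python
import numpy as np

def dp_noisy_gradient(grad, clip_norm=1.0, noise_multiplier=1.0, rng=None):
    """Clip a client gradient and add Gaussian noise (DP-SGD style sketch).

    Hypothetical helper for illustration only; not the method proposed
    in the paper, which aims to avoid the utility loss of plain DP noising.
    """
    rng = rng or np.random.default_rng()
    # Clip the gradient to bound its L2 sensitivity.
    norm = np.linalg.norm(grad)
    clipped = grad * min(1.0, clip_norm / (norm + 1e-12))
    # Add Gaussian noise calibrated to the clipping norm before sharing.
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=grad.shape)
    return clipped + noise

# Example: a client perturbs its local gradient before upload.
g = np.array([0.5, -2.0, 1.5])
print(dp_noisy_gradient(g, clip_norm=1.0, noise_multiplier=0.5))
```

The trade-off the abstract highlights is visible here: a larger `noise_multiplier` strengthens privacy but perturbs the shared gradient more, degrading model performance.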

