Federated Transfer Learning with Differential Privacy
March 19, 2024, 4:41 a.m. | Mengchu Li, Ye Tian, Yang Feng, Yi Yu
cs.LG updates on arXiv.org
Abstract: Federated learning is gaining increasing popularity, with data heterogeneity and privacy being two prominent challenges. In this paper, we address both issues within a federated transfer learning framework, aiming to enhance learning on a target data set by leveraging information from multiple heterogeneous source data sets while adhering to privacy constraints. We rigorously formulate the notion of \textit{federated differential privacy}, which offers privacy guarantees for each data set without assuming a trusted central server. Under …
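The abstract's key idea is that privacy is guaranteed per data set without a trusted central server, which in practice means each source site perturbs what it shares before aggregation. As a minimal illustrative sketch (not the paper's actual algorithm), each site could clip its local statistic and add Gaussian noise locally, so the server only ever sees privatized releases; the function names and parameters below are hypothetical:

```python
import numpy as np

def privatize_local_stat(stat, clip_norm, noise_std, rng):
    """Clip a site's local statistic and add Gaussian noise before release.

    Because the noise is added locally, each data set gets its own
    privacy guarantee without trusting the central server.
    """
    norm = np.linalg.norm(stat)
    clipped = stat * min(1.0, clip_norm / norm) if norm > 0 else stat
    return clipped + rng.normal(0.0, noise_std, size=stat.shape)

def aggregate(site_stats, clip_norm=1.0, noise_std=0.5, seed=0):
    """Server-side averaging of the noisy statistics released by each site."""
    rng = np.random.default_rng(seed)
    noisy = [privatize_local_stat(s, clip_norm, noise_std, rng)
             for s in site_stats]
    return np.mean(noisy, axis=0)
```

Clipping bounds each site's sensitivity, which is what calibrates the Gaussian noise scale to a privacy budget; the paper's federated transfer setting additionally weighs how informative each heterogeneous source is for the target.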