Global and Local Prompts Cooperation via Optimal Transport for Federated Learning
March 4, 2024, 5:41 a.m. | Hongxia Li, Wei Huang, Jingya Wang, Ye Shi
cs.LG updates on arXiv.org
Abstract: Prompt learning in pretrained visual-language models has shown remarkable flexibility across various downstream tasks. Leveraging its inherent lightweight nature, recent research has attempted to integrate these powerful pretrained models into federated learning frameworks to simultaneously reduce communication costs and promote local training on insufficient data. Despite these efforts, current federated prompt learning methods lack specialized designs to systematically address severe data heterogeneity, e.g., data distributions involving both label and feature shifts. To address this challenge, …
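The communication-cost argument in the abstract rests on clients exchanging only small prompt parameters rather than full model weights. As a minimal, hypothetical illustration (this is plain FedAvg-style weighted averaging of prompt vectors, not the paper's optimal-transport-based cooperation), aggregation on the server might look like:

```python
def fed_avg_prompts(client_prompts, client_sizes):
    """Server-side aggregation: weighted average of per-client prompt
    parameters (flattened to flat lists of floats), weighted by each
    client's local dataset size, FedAvg-style."""
    total = sum(client_sizes)
    dim = len(client_prompts[0])
    global_prompt = [0.0] * dim
    for prompt, n in zip(client_prompts, client_sizes):
        w = n / total  # client weight proportional to its data size
        for i, v in enumerate(prompt):
            global_prompt[i] += w * v
    return global_prompt

# Two clients with unequal data sizes; only the tiny prompt vectors
# travel over the network, never the frozen backbone's weights.
print(fed_avg_prompts([[1.0, 2.0], [3.0, 4.0]], [3, 1]))  # [1.5, 2.5]
```

Since the prompt is typically a few hundred floats while the frozen backbone holds hundreds of millions of parameters, this is where the claimed communication savings come from; the paper's contribution concerns how global and local prompts cooperate, which this sketch does not capture.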