FedPEAT: Convergence of Federated Learning, Parameter-Efficient Fine Tuning, and Emulator Assisted Tuning for Artificial Intelligence Foundation Models with Mobile Edge Computing
Feb. 29, 2024, 5:43 a.m. | Terence Jie Chua, Wenhan Yu, Jun Zhao, Kwok-Yan Lam
cs.LG updates on arXiv.org (arxiv.org)
Abstract: The emergence of foundation models, including language and vision models, has reshaped AI's landscape, offering capabilities across various applications. Deploying and fine-tuning these large models, like GPT-3 and BERT, presents challenges, especially in the current foundation model era. We introduce Emulator-Assisted Tuning (EAT) combined with Parameter-Efficient Fine-Tuning (PEFT) to form Parameter-Efficient Emulator-Assisted Tuning (PEAT). Further, we expand this into federated learning as Federated PEAT (FedPEAT). FedPEAT uses adapters, emulators, and PEFT for federated model tuning, …
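The abstract stops short of implementation detail, so the sketch below is only a rough illustration of the mechanism it describes, under assumed design choices: a frozen "emulator" (here a naive subset of the server's blocks, standing in for whatever compression or distillation the paper's EAT component actually uses) is shipped to each device, each device trains only lightweight adapter modules (PEFT), and the server aggregates nothing but those adapter weights with plain FedAvg. All names here (ToyBlock, Adapter, build_emulator, client_update, fedavg_adapters) and the toy regression objective are hypothetical, not the paper's implementation.

import copy
import torch
import torch.nn as nn

class ToyBlock(nn.Module):
    """Stand-in for one block of a large foundation model (hypothetical)."""
    def __init__(self, dim):
        super().__init__()
        self.ff = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, x):
        return x + self.ff(x)

class Adapter(nn.Module):
    """Small bottleneck adapter: the only parameters a client trains (PEFT)."""
    def __init__(self, dim, bottleneck=8):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)
        self.up = nn.Linear(bottleneck, dim)

    def forward(self, x):
        return x + self.up(torch.relu(self.down(x)))

def build_emulator(full_blocks, keep_every=2):
    """Crude emulator: a frozen subset of the server's blocks, standing in
    for a properly compressed or distilled emulator."""
    emulator = nn.ModuleList(copy.deepcopy(b) for b in full_blocks[::keep_every])
    for p in emulator.parameters():
        p.requires_grad_(False)
    return emulator

def client_update(emulator, adapters, batch, lr=1e-2, steps=5):
    """One client round: train only the adapters against the frozen emulator."""
    opt = torch.optim.SGD([p for a in adapters for p in a.parameters()], lr=lr)
    x, y = batch
    for _ in range(steps):
        h = x
        for block, adapter in zip(emulator, adapters):
            h = adapter(block(h))
        loss = nn.functional.mse_loss(h, y)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return [a.state_dict() for a in adapters]

def fedavg_adapters(all_client_states):
    """Server step: plain FedAvg, but only over the small adapter weights."""
    n = len(all_client_states)
    averaged = copy.deepcopy(all_client_states[0])
    for i, sd in enumerate(averaged):          # one state_dict per adapter slot
        for k in sd:
            sd[k] = sum(cs[i][k] for cs in all_client_states) / n
    return averaged

if __name__ == "__main__":
    dim, n_blocks, n_clients = 16, 4, 3
    server_blocks = nn.ModuleList(ToyBlock(dim) for _ in range(n_blocks))
    emulator = build_emulator(server_blocks)              # sent to clients once

    client_states = []
    for _ in range(n_clients):
        adapters = nn.ModuleList(Adapter(dim) for _ in range(len(emulator)))
        batch = (torch.randn(8, dim), torch.randn(8, dim))  # toy local data
        client_states.append(client_update(emulator, adapters, batch))

    global_adapters = fedavg_adapters(client_states)       # broadcast next round

The point of this pattern in a mobile edge setting is communication and memory cost: in the sketch, a client never holds or transmits the full foundation model, only the emulator (downloaded once) and the adapter weights (uploaded each round).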