Feb. 5, 2024, 3:42 p.m. | Liping Yi, Han Yu, Chao Ren, Heng Zhang, Gang Wang, Xiaoguang Liu, Xiaoxiao Li

cs.LG updates on arXiv.org arxiv.org

Federated learning (FL) is widely employed for collaborative training on decentralized data, but it faces challenges such as data, system, and model heterogeneity. This has prompted the emergence of model-heterogeneous personalized federated learning (MHPFL). However, concerns persist regarding data and model privacy, model performance, and the communication and computation costs of current MHPFL methods. To address these concerns, we propose a novel model-heterogeneous personalized Federated learning algorithm based on the Mixture of Experts (FedMoE), an architecture renowned for enhancing large language models (LLMs). It assigns a shared …
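
The truncated abstract names the Mixture of Experts as FedMoE's core mechanism. Below is a minimal PyTorch sketch of the general idea: a gating network softly mixes a small shared expert (which could be aggregated by an FL server) with a client-local expert (which stays private). The class name, layer sizes, and two-expert layout are illustrative assumptions, not the paper's actual FedMoE architecture.

```python
import torch
import torch.nn as nn


class TwoExpertMoE(nn.Module):
    """Hypothetical two-expert MoE head for one FL client (illustrative only)."""

    def __init__(self, in_dim: int, hidden_dim: int, num_classes: int):
        super().__init__()
        # Shared expert: deliberately small and homogeneous across clients,
        # so its parameters could be averaged by a federated server
        # (an assumption based on the abstract's framing).
        self.shared_expert = nn.Sequential(
            nn.Linear(in_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, num_classes),
        )
        # Local expert: stands in for the client's heterogeneous
        # personalized model; it never leaves the client.
        self.local_expert = nn.Sequential(
            nn.Linear(in_dim, 2 * hidden_dim), nn.ReLU(),
            nn.Linear(2 * hidden_dim, num_classes),
        )
        # Gating network: per-sample mixing weights over the two experts.
        self.gate = nn.Linear(in_dim, 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = torch.softmax(self.gate(x), dim=-1)          # (batch, 2)
        outputs = torch.stack(
            [self.shared_expert(x), self.local_expert(x)], dim=1
        )                                                       # (batch, 2, classes)
        return (weights.unsqueeze(-1) * outputs).sum(dim=1)    # (batch, classes)


# Usage: only shared_expert's parameters would be uploaded for
# FedAvg-style aggregation; the gate and local expert stay on-device.
model = TwoExpertMoE(in_dim=32, hidden_dim=64, num_classes=10)
logits = model(torch.randn(8, 32))
```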

