Training Heterogeneous Client Models using Knowledge Distillation in Serverless Federated Learning
Feb. 13, 2024, 5:42 a.m. | Mohak Chadha, Pulkit Khera, Jianfeng Gu, Osama Abboud, Michael Gerndt
cs.LG updates on arXiv.org