Aligning Logits Generatively for Principled Black-Box Knowledge Distillation
April 2, 2024, 7:44 p.m. | Jing Ma, Xiang Xiang, Ke Wang, Yuchuan Wu, Yongbin Li
cs.LG updates on arXiv.org
Abstract: Black-Box Knowledge Distillation (B2KD) is a formulated problem for cloud-to-edge model compression with invisible data and models hosted on the server. B2KD faces challenges such as limited Internet exchange and the edge-cloud disparity of data distributions. In this paper, we formalize a two-step workflow consisting of deprivatization and distillation, and theoretically provide a new optimization direction, from logits to cell boundary, that differs from direct logits alignment. With its guidance, we propose a new method, Mapping-Emulation KD (MEKD) …
Subjects: cs.LG; cs.AI; cs.CV; stat.ML
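For context, the baseline the abstract contrasts with, direct logits alignment, trains the edge-side student to match soft outputs obtained only by querying the server-hosted teacher. Below is a minimal PyTorch sketch of that baseline in a black-box setting; `query_teacher`, `student`, and `temperature` are illustrative names, not from the paper, and the paper's proposed MEKD (logits-to-cell-boundary) method is not reproduced here.

```python
import torch
import torch.nn.functional as F

def direct_logit_alignment_loss(student_logits, teacher_logits, temperature=4.0):
    """Classic soft-target KD loss: KL divergence between
    temperature-softened teacher and student distributions."""
    log_p_student = F.log_softmax(student_logits / temperature, dim=-1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    # "batchmean" matches the mathematical definition of KL divergence;
    # the T^2 factor keeps gradient scale comparable across temperatures.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature ** 2

def distill_step(student, optimizer, query_teacher, x, temperature=4.0):
    """One optimization step against a black-box teacher.

    `query_teacher` (hypothetical) stands in for the server-side model:
    it accepts a batch and returns logits, the only signal that crosses
    the cloud-edge boundary in the B2KD setting.
    """
    with torch.no_grad():
        teacher_logits = query_teacher(x)  # opaque remote call, no gradients
    student_logits = student(x)
    loss = direct_logit_alignment_loss(student_logits, teacher_logits, temperature)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Because the teacher is only an opaque endpoint, no gradients flow through it: the entire learning signal reaches the student via the soft targets, which is the restriction that motivates the paper's alternative optimization direction.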