Rethinking Centered Kernel Alignment in Knowledge Distillation
Feb. 29, 2024, 5:46 a.m. | Zikai Zhou, Yunhang Shen, Shitong Shao, Linrui Gong, Shaohui Lin
cs.CV updates on arXiv.org
Abstract: Knowledge distillation has emerged as a highly effective method for bridging the representation discrepancy between large-scale models and lightweight models. Prevalent approaches involve leveraging appropriate metrics to minimize the divergence or distance between the knowledge extracted from the teacher model and the knowledge learned by the student model. Centered Kernel Alignment (CKA) is widely used to measure representation similarity and has been applied in several knowledge distillation methods. However, these methods are complex and fail …
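For context, CKA compares two sets of representations via a normalized HSIC statistic. Below is a minimal NumPy sketch of the standard linear-kernel CKA, not the paper's proposed method; the batch size, feature dimensions, and variable names are illustrative assumptions.

```python
import numpy as np

def linear_cka(X: np.ndarray, Y: np.ndarray) -> float:
    """Linear CKA between two feature matrices (rows = examples)."""
    # Center each feature dimension so HSIC reduces to Frobenius norms.
    X = X - X.mean(axis=0, keepdims=True)
    Y = Y - Y.mean(axis=0, keepdims=True)
    # CKA(X, Y) = ||Y^T X||_F^2 / (||X^T X||_F * ||Y^T Y||_F)
    hsic_xy = np.linalg.norm(Y.T @ X, "fro") ** 2
    hsic_xx = np.linalg.norm(X.T @ X, "fro")
    hsic_yy = np.linalg.norm(Y.T @ Y, "fro")
    return hsic_xy / (hsic_xx * hsic_yy)

# Hypothetical usage: compare teacher and student features on one batch.
rng = np.random.default_rng(0)
teacher_feats = rng.normal(size=(64, 512))  # assumed teacher feature dim
student_feats = rng.normal(size=(64, 128))  # assumed student feature dim
print(linear_cka(teacher_feats, student_feats))
```

The score lies in [0, 1] and is invariant to orthogonal transformations and isotropic scaling of either representation, which is why CKA can compare teacher and student features of different widths, as in the distillation setting the abstract describes.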