Feb. 29, 2024, 5:46 a.m. | Zikai Zhou, Yunhang Shen, Shitong Shao, Linrui Gong, Shaohui Lin

cs.CV updates on arXiv.org (arxiv.org)

arXiv:2401.11824v2 Announce Type: replace
Abstract: Knowledge distillation has emerged as a highly effective method for bridging the representation discrepancy between large-scale models and lightweight models. Prevalent approaches involve leveraging appropriate metrics to minimize the divergence or distance between the knowledge extracted from the teacher model and the knowledge learned by the student model. Centered Kernel Alignment (CKA) is widely used to measure representation similarity and has been applied in several knowledge distillation methods. However, these methods are complex and fail …
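The truncated abstract does not spell out the paper's exact formulation, so as a rough illustration of the ingredient it centers on, here is a minimal sketch of linear Centered Kernel Alignment between a batch of student and teacher features, assuming a PyTorch setting. The function names, tensor shapes, and the `1 - CKA` loss form are assumptions for illustration, not the paper's method.

```python
import torch


def linear_cka(feat_s: torch.Tensor, feat_t: torch.Tensor) -> torch.Tensor:
    """Linear CKA between student features (n, d_s) and teacher features (n, d_t).

    Returns a similarity score in [0, 1]; higher means the two batches of
    representations are more similar up to orthogonal transforms and
    isotropic scaling.
    """
    # Center each feature dimension over the batch.
    feat_s = feat_s - feat_s.mean(dim=0, keepdim=True)
    feat_t = feat_t - feat_t.mean(dim=0, keepdim=True)

    # Linear CKA: ||T^T S||_F^2 / (||S^T S||_F * ||T^T T||_F)
    cross = (feat_t.T @ feat_s).pow(2).sum()           # ||T^T S||_F^2
    norm_s = (feat_s.T @ feat_s).pow(2).sum().sqrt()   # ||S^T S||_F
    norm_t = (feat_t.T @ feat_t).pow(2).sum().sqrt()   # ||T^T T||_F
    return cross / (norm_s * norm_t + 1e-12)


def cka_distillation_loss(feat_s: torch.Tensor, feat_t: torch.Tensor) -> torch.Tensor:
    # Hypothetical distillation term: maximize similarity, i.e. minimize 1 - CKA.
    return 1.0 - linear_cka(feat_s, feat_t)


if __name__ == "__main__":
    s = torch.randn(128, 256)    # student features (batch, dim)
    t = torch.randn(128, 1024)   # teacher features (batch, dim)
    print("CKA:", linear_cka(s, t).item())
    print("loss:", cka_distillation_loss(s, t).item())
```

Because CKA is invariant to orthogonal transforms and isotropic scaling of either feature matrix, it can compare student and teacher layers of different widths directly, which is what makes it a natural similarity metric for distillation objectives like the one described above.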
