In-situ animal behavior classification using knowledge distillation and fixed-point quantization. (arXiv:2209.04130v1 [cs.LG])
cs.LG updates on arXiv.org
We explore the use of knowledge distillation (KD) for learning compact and
accurate models that enable classification of animal behavior from
accelerometry data on wearable devices. To this end, we take a deep
convolutional neural network, the residual neural network (ResNet), as the
teacher model; ResNet is specifically designed for multivariate time-series
classification. We use ResNet to distill the knowledge of animal behavior
classification datasets into soft labels, which consist of the predicted
pseudo-probabilities of every class …
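The soft-label idea described above can be illustrated with a minimal sketch: the teacher's logits are converted into temperature-softened pseudo-probabilities, and the student is trained to match them via a KL-divergence loss (the standard Hinton-style formulation). This is not the authors' code; the function names, the NumPy implementation, and the temperature value are assumptions for illustration.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T yields softer probabilities.
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def kd_loss(student_logits, teacher_logits, T=4.0):
    # KL divergence between the teacher's softened distribution (the
    # "soft labels") and the student's, scaled by T^2 so gradient
    # magnitudes stay comparable across temperatures.
    p = softmax(teacher_logits, T)  # teacher soft labels
    q = softmax(student_logits, T)  # student predictions
    return T ** 2 * float(np.sum(p * (np.log(p) - np.log(q))))
```

In practice this distillation term is combined with the usual cross-entropy against the hard labels; the loss is zero when the student exactly reproduces the teacher's logits.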