Sept. 12, 2022, 1:11 a.m. | Reza Arablouei, Liang Wang, Caitlin Phillips, Lachlan Currie, Jordan Yates, Greg Bishop-Hurley

cs.LG updates on arXiv.org arxiv.org

We explore the use of knowledge distillation (KD) for learning compact and
accurate models that enable classification of animal behavior from
accelerometry data on wearable devices. To this end, we take a deep and complex
convolutional neural network, known as the residual neural network (ResNet), as
the teacher model; this ResNet variant is specifically designed for multivariate
time-series classification. We use the ResNet to distil the knowledge of animal
behavior classification datasets into soft labels, which consist of the
predicted pseudo-probabilities of every class …
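The soft labels described above are typically obtained by applying a temperature-softened softmax to the teacher's logits, and the student is trained to match them via a KL-divergence term. A minimal sketch of that idea, assuming the standard Hinton-style distillation loss (the function names, temperature value, and plain-Python implementation are illustrative, not the paper's exact setup):

```python
import math

def softmax(logits, temperature=1.0):
    # Softened class probabilities; a higher temperature flattens the
    # distribution, exposing the teacher's "dark knowledge" about class
    # similarities.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(teacher_logits, student_logits, temperature=2.0):
    # KL(teacher || student) between temperature-softened distributions,
    # scaled by T^2 so gradient magnitudes stay comparable across temperatures.
    p = softmax(teacher_logits, temperature)  # soft labels from the teacher
    q = softmax(student_logits, temperature)  # student's softened predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl
```

In practice the distillation term is combined with a standard cross-entropy loss on the hard labels, weighted by a mixing coefficient.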

arxiv behavior classification distillation fixed-point knowledge quantization
