April 10, 2024, 4:43 a.m. | Han-Byul Kim, Joo Hyung Lee, Sungjoo Yoo, Hong-Seok Kim

cs.LG updates on arXiv.org

arXiv:2311.06798v2 Announce Type: replace
Abstract: Mixed-precision quantization of efficient networks often suffer from activation instability encountered in the exploration of bit selections. To address this problem, we propose a novel method called MetaMix which consists of bit selection and weight training phases. The bit selection phase iterates two steps, (1) the mixed-precision-aware weight update, and (2) the bit-search training with the fixed mixed-precision-aware weights, both of which combined reduce activation instability in mixed-precision quantization and contribute to fast and high-quality …
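The alternating bit-selection phase described in the abstract can be illustrated with a toy sketch. Everything below is a hypothetical simplification for intuition, not the authors' method: a small regression problem stands in for the task loss, uniform symmetric quantization stands in for the quantizer, a softmax over candidate bit-widths stands in for the bit search, and a straight-through estimate is used for the weight update. The names (`quantize`, `expected_q`, `bit_cost`) are invented for this sketch.

```python
import numpy as np

def quantize(w, bits):
    """Uniform symmetric quantization of w to the given bit-width (toy quantizer)."""
    levels = 2 ** (bits - 1) - 1
    scale = np.abs(w).max() / levels
    return np.round(w / scale) * scale

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

rng = np.random.default_rng(0)
target = rng.normal(size=32)                  # toy regression target
w = target + rng.normal(scale=0.5, size=32)   # trainable weights
bits = [2, 4, 8]                              # candidate bit-widths
logits = np.zeros(len(bits))                  # bit-selection parameters
bit_cost = 0.01 * np.array(bits, dtype=float) # hypothetical penalty on wider bits

def expected_q(w, probs):
    """Expected quantized weights under the current bit distribution."""
    return sum(p * quantize(w, b) for p, b in zip(probs, bits))

def loss(w_q):
    return np.mean((w_q - target) ** 2)

initial = loss(expected_q(w, softmax(logits)))

for _ in range(20):
    probs = softmax(logits)
    # Step (1): mixed-precision-aware weight update -- train the weights
    # against their expected quantized view (straight-through gradient).
    w_q = expected_q(w, probs)
    w -= 0.1 * 2 * (w_q - target)
    # Step (2): bit-search training with the weights held fixed -- shift the
    # bit logits toward candidates with lower task loss plus bit cost.
    scores = np.array([loss(quantize(w, b)) for b in bits]) + bit_cost
    logits -= 5.0 * probs * (scores - probs @ scores)  # grad of E[score] w.r.t. logits

final = loss(expected_q(w, softmax(logits)))
```

Freezing the weights during the bit search mirrors the abstract's point: the activations (here, the quantized weight view) seen by the search do not shift under it, which is the kind of instability the alternation is meant to avoid.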

