Optimizing the Deployment of Tiny Transformers on Low-Power MCUs
April 5, 2024, 4:41 a.m. | Victor J. B. Jung, Alessio Burrello, Moritz Scherer, Francesco Conti, Luca Benini
cs.LG updates on arXiv.org arxiv.org
Abstract: Transformer networks are rapidly becoming the state of the art (SotA) in many fields, such as NLP and CV. As with CNNs, there is a strong push to deploy Transformer models at the extreme edge, ultimately fitting the tiny power budget and memory footprint of MCUs. However, early approaches in this direction are mostly ad hoc and platform- and model-specific. This work aims to enable and optimize the flexible, multi-platform deployment of encoder Tiny Transformers on commercial MCUs. We propose a …
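To make the memory constraint concrete, here is a minimal sketch (not from the paper) of a back-of-the-envelope calculation for whether a tiny encoder Transformer fits an MCU's flash/RAM budget. The layer dimensions and the 1-byte-per-parameter assumption (int8 quantization) are illustrative choices, not the authors' configuration.

```python
def encoder_layer_params(d_model: int, d_ff: int) -> int:
    """Parameter count for one standard Transformer encoder layer."""
    # Multi-head self-attention: Q, K, V and output projections (weights + biases)
    attn = 4 * (d_model * d_model + d_model)
    # Feed-forward network: two linear layers (weights + biases)
    ffn = (d_model * d_ff + d_ff) + (d_ff * d_model + d_model)
    # Two LayerNorms, each with scale and shift vectors
    norms = 2 * 2 * d_model
    return attn + ffn + norms


def model_footprint_bytes(n_layers: int, d_model: int, d_ff: int,
                          bytes_per_param: int = 1) -> int:
    """Total weight storage, assuming uniform layers and int8 weights."""
    return n_layers * encoder_layer_params(d_model, d_ff) * bytes_per_param


# Hypothetical tiny configuration: 4 layers, d_model=64, d_ff=128, int8 weights
footprint = model_footprint_bytes(n_layers=4, d_model=64, d_ff=128)
print(f"{footprint / 1024:.1f} KiB")  # ~130.8 KiB: within a typical MCU's flash
```

Even this toy configuration shows why quantization matters: the same model in float32 would need roughly 4x the storage, exceeding the RAM of many commercial MCUs.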