Is Integer Arithmetic Enough for Deep Learning Training? (arXiv:2207.08822v2 [cs.LG] UPDATED)
Oct. 11, 2022, 1:14 a.m. | Alireza Ghaffari, Marzieh S. Tahaei, Mohammadreza Tayaranian, Masoud Asgharian, Vahid Partovi Nia
cs.LG updates on arXiv.org arxiv.org
The ever-increasing computational complexity of deep learning models makes
their training and deployment difficult on various cloud and edge platforms.
Replacing floating-point arithmetic with low-bit integer arithmetic is a
promising approach to reducing the energy consumption, memory footprint, and
latency of deep learning models. As such, quantization has attracted the
attention of researchers in recent years. However, using integer numbers to
build a fully functional integer training pipeline, including the forward pass,
back-propagation, and stochastic gradient descent, has not been studied in
detail. Our …
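To make the premise concrete, the replacement of floating-point values with low-bit integers typically starts from a quantization step such as the one sketched below. This is a generic symmetric per-tensor int8 scheme, not the specific method of the paper; the function names are illustrative.

```python
import numpy as np

def quantize_int8(x):
    """Symmetric per-tensor quantization: map floats to int8 with one scale.

    The scale stretches the largest magnitude in x to the int8 range [-127, 127].
    """
    max_abs = np.max(np.abs(x))
    scale = max_abs / 127.0 if max_abs > 0 else 1.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover an approximate float tensor from int8 values and the scale."""
    return q.astype(np.float32) * scale

# Round-trip a small tensor: the reconstruction error is bounded by scale / 2.
x = np.array([-1.5, 0.0, 0.75, 1.5], dtype=np.float32)
q, s = quantize_int8(x)
x_hat = dequantize(q, s)
```

In a full integer training pipeline, schemes like this must also cover gradients and optimizer state, which is exactly the gap the abstract points to.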