On the influence of roundoff errors on the convergence of the gradient descent method with low-precision floating-point computation. (arXiv:2202.12276v1 [cs.LG])
Feb. 25, 2022, 2:11 a.m. | Lu Xia, Stefano Massei, Michiel Hochstenbach, Barry Koren
cs.LG updates on arXiv.org
Stochastic rounding schemes help prevent the stagnation of convergence caused
by the vanishing-gradient effect when the gradient descent method is
implemented in low-precision floating-point arithmetic. Conventional
stochastic rounding achieves zero bias by preserving small updates with
probabilities proportional to their relative magnitudes. In this study, we
propose a new stochastic rounding scheme that trades the zero-bias property
for a larger probability of preserving small gradients. Our method yields a
constant rounding bias that, at each iteration, lies in a …
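For intuition, here is a minimal NumPy sketch of the conventional zero-bias stochastic rounding that the abstract uses as its baseline (not the paper's proposed biased scheme, which is truncated above); the function name, the grid spacing `ulp`, and the sample size are illustrative assumptions:

```python
import numpy as np

# Grid spacing standing in for a low-precision format's unit in the
# last place; 2**-10 roughly matches fp16 near 1.0. Illustrative only.
ULP = 2.0 ** -10

def stochastic_round(x, ulp=ULP, rng=np.random.default_rng(0)):
    """Conventional (zero-bias) stochastic rounding to a uniform grid.

    x is rounded up with probability equal to its fractional position
    between the two neighboring grid points, so E[result] == x.
    """
    scaled = np.asarray(x, dtype=np.float64) / ulp
    lower = np.floor(scaled)
    # A small update x < ulp survives (rounds to ulp instead of 0)
    # with probability x / ulp -- proportional to its magnitude.
    round_up = rng.random(scaled.shape) < (scaled - lower)
    return (lower + round_up) * ulp

# An update far below the grid spacing would be flushed to zero by
# round-to-nearest, but survives stochastically, unbiased on average:
tiny = 1e-4                           # < ULP (about 9.77e-4)
samples = stochastic_round(np.full(100_000, tiny))
print(samples.mean())                 # approximately 1e-4
```

Under round-to-nearest, any update smaller than ulp/2 is lost every iteration; this is the stagnation the abstract refers to, and zero-bias stochastic rounding avoids it by letting small updates accumulate correctly in expectation.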