Nov. 8, 2022, 2:13 a.m. | Louis Leconte, Sholom Schechtman, Eric Moulines

stat.ML updates on arXiv.org

In this paper, we develop a new algorithm, Annealed Skewed SGD (AskewSGD),
for training deep neural networks (DNNs) with quantized weights. First, we
formulate the training of quantized neural networks (QNNs) as a smoothed
sequence of interval-constrained optimization problems. Then, we propose a new
first-order stochastic method, AskewSGD, to solve each constrained optimization
subproblem. Unlike algorithms with active sets and feasible directions,
AskewSGD avoids projections or optimization under the entire feasible set and
allows iterates that are infeasible. …
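The abstract frames QNN training as a sequence of interval-constrained subproblems solved by a first-order method that never projects onto the feasible set. As a rough illustration of that setup only (not the authors' AskewSGD update rule; the levels {-1, +1}, the interval half-width `eps`, and the quadratic penalty are all assumptions of this sketch), here is a minimal penalty-based first-order step in NumPy in which iterates stay free to leave their intervals:

```python
import numpy as np

def nearest_level(w):
    """Nearest quantization level in {-1, +1}; ties at 0 go to +1 (arbitrary choice)."""
    return np.where(w >= 0.0, 1.0, -1.0)

def interval_violation_grad(w, eps):
    """Gradient of the soft penalty 0.5 * max(|w - q| - eps, 0)^2.

    Zero inside the interval [q - eps, q + eps]; outside, it pushes w
    back toward the interval -- a soft constraint, not a projection.
    """
    q = nearest_level(w)
    excess = np.abs(w - q) - eps
    return np.where(excess > 0.0, np.sign(w - q) * excess, 0.0)

def penalized_sgd_step(w, grad, lr=0.05, eps=0.25, penalty=1.0):
    """One first-order step on loss-gradient plus penalty-gradient.
    No projection is applied, so iterates may sit outside their intervals."""
    return w - lr * (grad + penalty * interval_violation_grad(w, eps))

# Toy usage: pull w toward `target` while softly confining each weight
# to an interval of half-width eps around a level in {-1, +1}.
rng = np.random.default_rng(0)
w = rng.normal(size=5)
target = np.array([0.9, -1.1, 0.3, -0.4, 1.5])
for _ in range(500):
    # Noisy gradient of the toy loss 0.5 * ||w - target||^2.
    grad = (w - target) + 0.01 * rng.normal(size=5)
    w = penalized_sgd_step(w, grad)
print(np.round(w, 3))  # weights settle near, but possibly outside, the intervals
```

A projected method would instead clip each weight back into [q - eps, q + eps] after every step; the penalty formulation above is one simple way to realize the "infeasible iterates" behavior the abstract highlights.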

arxiv, interval, networks, neural networks, optimisation
