Aug. 31, 2022, 1:13 a.m. | Xiaoxia Wu, Edgar Dobriban, Tongzheng Ren, Shanshan Wu, Zhiyuan Li, Suriya Gunasekar, Rachel Ward, Qiang Liu

cs.CV updates on arXiv.org

Normalization methods such as batch [Ioffe and Szegedy, 2015], weight
[Salimans and Kingma, 2016], instance [Ulyanov et al., 2016], and layer
normalization [Ba et al., 2016] have been widely used in modern machine
learning. Here, we study the weight normalization (WN) method [Salimans and
Kingma, 2016] and a variant called reparametrized projected gradient descent
(rPGD) for overparametrized least-squares regression. WN and rPGD reparametrize
the weights with a scale g and a unit vector w and thus the objective function
becomes non-convex. We …
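
As a minimal illustration (not the authors' code), the sketch below applies the WN reparametrization v = g · w / ‖w‖ to an overparametrized least-squares problem and runs plain gradient descent on (g, w); the data, step size, and iteration count are arbitrary choices made for the example.

```python
# Minimal sketch of weight normalization (WN) for least-squares regression:
# the effective weight is v = g * w / ||w||, and gradient descent is run on
# the scale g and direction w. Problem sizes and hyperparameters are
# illustrative assumptions, not values from the paper.
import numpy as np

rng = np.random.default_rng(0)
n, d = 20, 50                       # overparametrized: more features than samples
X = rng.standard_normal((n, d))
y = rng.standard_normal(n)

g = 1.0                             # scale parameter
w = rng.standard_normal(d)          # direction parameter (normalized in the loss)
lr = 1e-2

def loss(g, w):
    v = g * w / np.linalg.norm(w)   # WN reparametrization
    return 0.5 * np.sum((X @ v - y) ** 2)

for _ in range(2000):
    w_norm = np.linalg.norm(w)
    v = g * w / w_norm
    r = X @ v - y                   # residual
    grad_v = X.T @ r                # gradient w.r.t. the effective weight v
    # chain rule through v = g * w / ||w||
    grad_g = grad_v @ (w / w_norm)
    grad_w = (g / w_norm) * (grad_v - (grad_v @ w) * w / w_norm**2)
    g -= lr * grad_g
    w -= lr * grad_w

print(f"final loss: {loss(g, w):.3e}")
```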

arxiv convergence normalization regularization
