Aug. 31, 2022, 1:10 a.m. | Xiaoxia Wu, Edgar Dobriban, Tongzheng Ren, Shanshan Wu, Zhiyuan Li, Suriya Gunasekar, Rachel Ward, Qiang Liu

cs.LG updates on arXiv.org arxiv.org

Normalization methods such as batch [Ioffe and Szegedy, 2015], weight
[Salimans and Kingma, 2016], instance [Ulyanov et al., 2016], and layer
normalization [Ba et al., 2016] have been widely used in modern machine
learning. Here, we study the weight normalization (WN) method [Salimans and
Kingma, 2016] and a variant called reparametrized projected gradient descent
(rPGD) for overparametrized least-squares regression. WN and rPGD reparametrize
the weights with a scale g and a unit vector w, and thus the objective function
becomes non-convex. We …

arxiv convergence normalization regularization
