Implicit Regularization and Convergence for Weight Normalization. (arXiv:1911.07956v5 [cs.LG] UPDATED)
Aug. 31, 2022, 1:12 a.m. | Xiaoxia Wu, Edgar Dobriban, Tongzheng Ren, Shanshan Wu, Zhiyuan Li, Suriya Gunasekar, Rachel Ward, Qiang Liu
stat.ML updates on arXiv.org
Normalization methods such as batch [Ioffe and Szegedy, 2015], weight
[Salimans and Kingma, 2016], instance [Ulyanov et al., 2016], and layer
normalization [Ba et al., 2016] have been widely used in modern machine
learning. Here, we study the weight normalization (WN) method [Salimans and
Kingma, 2016] and a variant called reparametrized projected gradient descent
(rPGD) for overparametrized least-squares regression. WN and rPGD reparametrize
the weights with a scale g and a unit vector w, so the objective function
becomes non-convex. We …
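To make the reparametrization concrete, here is a minimal NumPy sketch of weight normalization applied to overparametrized least squares. This is an illustrative assumption on my part, not the paper's exact algorithm or analysis: the effective weight vector is written as v = g * w / ||w||, and plain gradient descent is run on the pair (g, w). The problem sizes, step size, and iteration count below are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 10                        # n < d: overparametrized regime
X = rng.standard_normal((n, d))
y = rng.standard_normal(n)

g = 1.0                             # scale parameter
w = rng.standard_normal(d)          # direction parameter (not unit-normalized in storage)
lr = 1e-2

def loss(g, w):
    v = g * w / np.linalg.norm(w)   # weight-normalized effective weights
    r = X @ v - y
    return 0.5 * np.dot(r, r)

for _ in range(10000):
    norm = np.linalg.norm(w)
    u = w / norm                    # unit direction
    r = X @ (g * u) - y             # residual
    grad_v = X.T @ r                # gradient w.r.t. effective weights v
    # Chain rule through v = g * w / ||w||:
    #   dL/dg = u . grad_v
    #   dL/dw = (g / ||w||) * (I - u u^T) grad_v   (projection off the radial direction)
    grad_g = u @ grad_v
    grad_w = (g / norm) * (grad_v - (u @ grad_v) * u)
    g -= lr * grad_g
    w -= lr * grad_w

print(loss(g, w))                   # near zero: the model interpolates the data
```

Note how the gradient with respect to w is projected orthogonal to w itself, so gradient descent only rotates the direction while g carries the magnitude; this split is what makes the objective non-convex in (g, w) even though it is convex in v.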