$L_2$BN: Enhancing Batch Normalization by Equalizing the $L_2$ Norms of Features. (arXiv:2207.02625v3 [cs.CV] UPDATED)
Sept. 2, 2022, 1:15 a.m. | Zhennan Wang, Kehan Li, Runyi Yu, Yian Zhao, Pengchong Qiao, Guoli Song, Fan Xu, Jie Chen
cs.CV updates on arXiv.org arxiv.org
In this paper, we show that the difference in $l_2$ norms of sample features
can hinder batch normalization from obtaining more distinguished inter-class
features and more compact intra-class features. To address this issue, we
propose an intuitive but effective method to equalize the $l_2$ norms of sample
features. Concretely, we $l_2$-normalize each sample feature before batch
normalization, and therefore the features are of the same magnitude. Since the
proposed method combines the $l_2$ normalization and batch normalization, we
name our …
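The abstract's core idea (l2-normalizing each sample feature before applying batch normalization, so all features enter BN with the same magnitude) can be sketched as follows. This is a minimal NumPy illustration assumed from the abstract alone, not the authors' implementation; the function name `l2bn` and the epsilon handling are my own choices.

```python
import numpy as np

def l2bn(x, eps=1e-5):
    """Sketch of the L2BN step described in the abstract:
    l2-normalize each sample's feature vector, then apply
    standard batch normalization over the batch dimension.

    x: array of shape (batch_size, num_features)
    """
    # Step 1: l2-normalize each sample feature so every sample
    # has the same (unit) l2 norm before normalization.
    x = x / (np.linalg.norm(x, axis=1, keepdims=True) + eps)
    # Step 2: standard batch normalization -- zero mean and
    # unit variance per feature dimension (affine params omitted).
    mean = x.mean(axis=0, keepdims=True)
    var = x.var(axis=0, keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

np.random.seed(0)
x = np.random.randn(8, 16)   # toy batch of 8 samples, 16 features
y = l2bn(x)
```

Because every sample is scaled to unit norm before the batch statistics are computed, differences in raw feature magnitude between samples no longer influence the normalization, which is the effect the paper attributes to its method.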