Feb. 20, 2024, 5:44 a.m. | Zikai Zhou, Shuo Zhang, Ziruo Wang, Huanran Chen

cs.LG updates on arXiv.org

arXiv:2308.03321v4 Announce Type: replace
Abstract: The success of deep learning is inseparable from normalization layers. Researchers have proposed various normalization functions, and each of them has both advantages and disadvantages. In response, efforts have been made to design a unified normalization function that combines all normalization procedures and mitigates their weaknesses. We also proposed a new normalization function called Adaptive Fusion Normalization. Through experiments, we demonstrate AFN outperforms the previous normalization techniques in domain generalization and image classification tasks.
