Web: https://www.reddit.com/r/MachineLearning/comments/xevqau/d_pretrained_networks_and_batch_normalization/

Sept. 15, 2022, 12:35 p.m. | /u/RaptorDotCpp

When we take a pre-trained network, e.g., ResNet50 on ImageNet, and want to apply it to a new dataset, what we typically do is:

1. Freeze the backbone, but keep the classifier trainable
2. Train until convergence
3. Unfreeze the backbone and train with a low learning rate until convergence
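Assuming PyTorch (the post doesn't name a framework, but `running_mean` is PyTorch's buffer name), the two-stage recipe above might be sketched like this — the toy backbone and classifier here stand in for a real pretrained network such as `torchvision.models.resnet50`:

```python
import torch
import torch.nn as nn

# Toy stand-ins for a pretrained backbone and a fresh classifier head.
backbone = nn.Sequential(nn.Linear(8, 16), nn.ReLU())
classifier = nn.Linear(16, 3)
model = nn.Sequential(backbone, classifier)

# Stage 1: freeze the backbone, train only the classifier.
for p in backbone.parameters():
    p.requires_grad = False
opt = torch.optim.Adam(classifier.parameters(), lr=1e-3)
# ... train until convergence ...

# Stage 2: unfreeze the backbone, fine-tune everything with a low LR.
for p in backbone.parameters():
    p.requires_grad = True
opt = torch.optim.Adam(model.parameters(), lr=1e-5)
# ... train until convergence ...
```

Note that `requires_grad = False` only stops gradient updates to the weights; as the post goes on to observe, it does not stop batch-norm layers from updating their running statistics.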

However, I noticed that when we freeze a network with batch normalization layers, the following buffers are still being updated, because the batch normalization layers remain in training mode: `running_mean`, …
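This is easy to reproduce (a minimal sketch, assuming PyTorch): with every parameter frozen, a forward pass in training mode still updates the BN buffers as `running_mean ← (1 - momentum) * running_mean + momentum * batch_mean`. Putting the BN modules in `.eval()` mode is what actually stops the update:

```python
import torch
import torch.nn as nn

bn = nn.BatchNorm1d(4)
for p in bn.parameters():        # freeze the affine weight and bias
    p.requires_grad = False

before = bn.running_mean.clone()
bn.train()                       # "frozen", but still in training mode
_ = bn(torch.randn(32, 4) + 5.0) # forward pass shifts the running stats
assert not torch.allclose(bn.running_mean, before)

bn.eval()                        # eval mode: buffers are left untouched
frozen = bn.running_mean.clone()
_ = bn(torch.randn(32, 4) + 5.0)
assert torch.allclose(bn.running_mean, frozen)
```

A common workaround during stage 1 is therefore to call `.eval()` on the BN submodules (or override `model.train()` to keep them in eval mode) so the pretrained statistics are preserved.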
