July 8, 2023, 12:02 p.m. | Alexey Kravets

Towards AI - Medium pub.towardsai.net

Freezing Layers of a Deep Learning Model — the proper way

ADAM optimizer example in PyTorch

Photo by Jason Mitrione on Unsplash

Introduction

It is often useful to freeze some of a model's parameters, for example when you are fine-tuning and want to freeze certain layers depending on the example being processed, as illustrated in

SpotTune: Transfer Learning through Adaptive Fine-tuning

As we can see, for the first example we freeze the first two layers and update the parameters of the …
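Although the excerpt is cut off above, the basic pattern the article builds on can be sketched as follows. This is a minimal illustration, assuming a toy nn.Sequential model whose layer indices, sizes, and learning rate are invented for the example and are not taken from the article: the first two linear layers are frozen by setting requires_grad to False, and only the remaining trainable parameters are handed to Adam.

```python
import torch
import torch.nn as nn

# Hypothetical toy model for illustration only.
model = nn.Sequential(
    nn.Linear(10, 32),
    nn.ReLU(),
    nn.Linear(32, 32),
    nn.ReLU(),
    nn.Linear(32, 2),
)

# Freeze the first two Linear layers by disabling gradient computation.
for param in model[0].parameters():
    param.requires_grad = False
for param in model[2].parameters():
    param.requires_grad = False

# Pass only the still-trainable parameters to Adam.
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3
)
```

Handing Adam only the trainable parameters keeps the optimizer from maintaining momentum and variance estimates for weights that are never meant to change.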

artificial intelligence deep learning gradient-descent machine learning optimization
